Oct 11 10:38:43.305289 master-0 systemd[1]: Starting Kubernetes Kubelet...
Oct 11 10:38:43.943081 master-0 kubenswrapper[4790]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 11 10:38:43.943081 master-0 kubenswrapper[4790]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Oct 11 10:38:43.943081 master-0 kubenswrapper[4790]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 11 10:38:43.943081 master-0 kubenswrapper[4790]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 11 10:38:43.943081 master-0 kubenswrapper[4790]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 11 10:38:43.943081 master-0 kubenswrapper[4790]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 11 10:38:43.945285 master-0 kubenswrapper[4790]: I1011 10:38:43.944067 4790 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 11 10:38:43.948423 master-0 kubenswrapper[4790]: W1011 10:38:43.948383 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 10:38:43.948423 master-0 kubenswrapper[4790]: W1011 10:38:43.948411 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 11 10:38:43.948423 master-0 kubenswrapper[4790]: W1011 10:38:43.948420 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 10:38:43.948423 master-0 kubenswrapper[4790]: W1011 10:38:43.948427 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948435 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948443 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948450 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948456 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948462 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948470 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948476 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948489 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948496 4790 feature_gate.go:330] unrecognized feature gate: Example
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948502 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948508 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948515 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948521 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948527 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948533 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948540 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948547 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948553 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948560 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 10:38:43.948556 master-0 kubenswrapper[4790]: W1011 10:38:43.948566 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948573 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948579 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948586 4790 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948592 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948598 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948605 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948612 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948619 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948625 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948634 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948643 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948651 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948659 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948666 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948674 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948681 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948689 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948696 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 11 10:38:43.949103 master-0 kubenswrapper[4790]: W1011 10:38:43.948728 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948737 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948744 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948751 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948758 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948766 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948774 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948779 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948784 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948792 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948797 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948803 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948808 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948813 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948824 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948830 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948836 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948843 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948854 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948866 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 11 10:38:43.949540 master-0 kubenswrapper[4790]: W1011 10:38:43.948873 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948884 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948899 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948905 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948913 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948920 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948927 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948934 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948940 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: W1011 10:38:43.948946 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949782 4790 flags.go:64] FLAG: --address="0.0.0.0"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949804 4790 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949818 4790 flags.go:64] FLAG: --anonymous-auth="true"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949828 4790 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949837 4790 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949845 4790 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949854 4790 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949862 4790 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949869 4790 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949875 4790 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949882 4790 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 11 10:38:43.949979 master-0 kubenswrapper[4790]: I1011 10:38:43.949890 4790 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949896 4790 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949902 4790 flags.go:64] FLAG: --cgroup-root=""
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949908 4790 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949915 4790 flags.go:64] FLAG: --client-ca-file=""
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949922 4790 flags.go:64] FLAG: --cloud-config=""
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949927 4790 flags.go:64] FLAG: --cloud-provider=""
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949933 4790 flags.go:64] FLAG: --cluster-dns="[]"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949942 4790 flags.go:64] FLAG: --cluster-domain=""
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949948 4790 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949954 4790 flags.go:64] FLAG: --config-dir=""
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949962 4790 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949968 4790 flags.go:64] FLAG: --container-log-max-files="5"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949978 4790 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949986 4790 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.949993 4790 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950001 4790 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950009 4790 flags.go:64] FLAG: --contention-profiling="false"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950017 4790 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950025 4790 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950033 4790 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950041 4790 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950052 4790 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950060 4790 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950067 4790 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 11 10:38:43.950579 master-0 kubenswrapper[4790]: I1011 10:38:43.950075 4790 flags.go:64] FLAG: --enable-load-reader="false"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950083 4790 flags.go:64] FLAG: --enable-server="true"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950091 4790 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950103 4790 flags.go:64] FLAG: --event-burst="100"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950110 4790 flags.go:64] FLAG: --event-qps="50"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950116 4790 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950123 4790 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950131 4790 flags.go:64] FLAG: --eviction-hard=""
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950141 4790 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950149 4790 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950156 4790 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950166 4790 flags.go:64] FLAG: --eviction-soft=""
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950174 4790 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950182 4790 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950189 4790 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950197 4790 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950205 4790 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950213 4790 flags.go:64] FLAG: --fail-swap-on="true"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950220 4790 flags.go:64] FLAG: --feature-gates=""
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950230 4790 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950241 4790 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950251 4790 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950260 4790 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950271 4790 flags.go:64] FLAG: --healthz-port="10248"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950280 4790 flags.go:64] FLAG: --help="false"
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950288 4790 flags.go:64] FLAG: --hostname-override=""
Oct 11 10:38:43.951121 master-0 kubenswrapper[4790]: I1011 10:38:43.950295 4790 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950304 4790 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950312 4790 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950319 4790 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950327 4790 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950335 4790 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950343 4790 flags.go:64] FLAG: --image-service-endpoint=""
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950351 4790 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950360 4790 flags.go:64] FLAG: --kube-api-burst="100"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950367 4790 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950374 4790 flags.go:64] FLAG: --kube-api-qps="50"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950381 4790 flags.go:64] FLAG: --kube-reserved=""
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950387 4790 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950394 4790 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950400 4790 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950406 4790 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950412 4790 flags.go:64] FLAG: --lock-file=""
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950418 4790 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950425 4790 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950431 4790 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950442 4790 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950450 4790 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950456 4790 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950462 4790 flags.go:64] FLAG: --logging-format="text"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950468 4790 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 11 10:38:43.951678 master-0 kubenswrapper[4790]: I1011 10:38:43.950474 4790 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950483 4790 flags.go:64] FLAG: --manifest-url=""
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950489 4790 flags.go:64] FLAG: --manifest-url-header=""
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950497 4790 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950504 4790 flags.go:64] FLAG: --max-open-files="1000000"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950511 4790 flags.go:64] FLAG: --max-pods="110"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950518 4790 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950524 4790 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950530 4790 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950537 4790 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950543 4790 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950549 4790 flags.go:64] FLAG: --node-ip="192.168.34.10"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950556 4790 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950576 4790 flags.go:64] FLAG: --node-status-max-images="50"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950582 4790 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950589 4790 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950595 4790 flags.go:64] FLAG: --pod-cidr=""
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950601 4790 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2d66b9dbe1d071d7372c477a78835fb65b48ea82db00d23e9086af5cfcb194ad"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950611 4790 flags.go:64] FLAG: --pod-manifest-path=""
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950617 4790 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950624 4790 flags.go:64] FLAG: --pods-per-core="0"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950630 4790 flags.go:64] FLAG: --port="10250"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950637 4790 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950643 4790 flags.go:64] FLAG: --provider-id=""
Oct 11 10:38:43.952283 master-0 kubenswrapper[4790]: I1011 10:38:43.950650 4790 flags.go:64] FLAG: --qos-reserved=""
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950656 4790 flags.go:64] FLAG: --read-only-port="10255"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950662 4790 flags.go:64] FLAG: --register-node="true"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950668 4790 flags.go:64] FLAG: --register-schedulable="true"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950675 4790 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950687 4790 flags.go:64] FLAG: --registry-burst="10"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950695 4790 flags.go:64] FLAG: --registry-qps="5"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950703 4790 flags.go:64] FLAG: --reserved-cpus=""
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950735 4790 flags.go:64] FLAG: --reserved-memory=""
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950745 4790 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950753 4790 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950760 4790 flags.go:64] FLAG: --rotate-certificates="false"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950768 4790 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950775 4790 flags.go:64] FLAG: --runonce="false"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950782 4790 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950789 4790 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950797 4790 flags.go:64] FLAG: --seccomp-default="false"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950805 4790 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950813 4790 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950820 4790 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950827 4790 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950834 4790 flags.go:64] FLAG: --storage-driver-password="root"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950843 4790 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950850 4790 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950856 4790 flags.go:64] FLAG: --storage-driver-user="root"
Oct 11 10:38:43.952807 master-0 kubenswrapper[4790]: I1011 10:38:43.950863 4790 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950869 4790 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950876 4790 flags.go:64] FLAG: --system-cgroups=""
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950882 4790 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950893 4790 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950898 4790 flags.go:64] FLAG: --tls-cert-file=""
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950904 4790 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950912 4790 flags.go:64] FLAG: --tls-min-version=""
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950918 4790 flags.go:64] FLAG: --tls-private-key-file=""
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950925 4790 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950931 4790 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950937 4790 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950943 4790 flags.go:64] FLAG: --v="2"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950958 4790 flags.go:64] FLAG: --version="false"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950967 4790 flags.go:64] FLAG: --vmodule=""
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950975 4790 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: I1011 10:38:43.950982 4790 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951147 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951155 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951162 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951168 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951174 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951179 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951184 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 11 10:38:43.953319 master-0 kubenswrapper[4790]: W1011 10:38:43.951189 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951195 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951200 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951205 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951211 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951219 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951224 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951229 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951234 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951240 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11
10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951245 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951250 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951257 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951263 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951269 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951274 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951279 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951285 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951290 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951295 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 11 10:38:43.953871 master-0 kubenswrapper[4790]: W1011 10:38:43.951301 4790 feature_gate.go:330] unrecognized feature gate: Example Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951306 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951311 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 11 10:38:43.954322 master-0 
kubenswrapper[4790]: W1011 10:38:43.951316 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951323 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951328 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951333 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951339 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951345 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951351 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951358 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951365 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951373 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951379 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951386 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951395 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951404 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951416 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951423 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 11 10:38:43.954322 master-0 kubenswrapper[4790]: W1011 10:38:43.951428 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951434 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951439 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951446 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951453 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951458 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951464 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951470 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951475 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951480 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951486 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951491 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951496 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951501 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951507 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951512 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951516 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951523 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951529 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951534 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 11 10:38:43.954762 master-0 kubenswrapper[4790]: W1011 10:38:43.951539 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 10:38:43.955212 master-0 kubenswrapper[4790]: W1011 10:38:43.951544 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 11 10:38:43.955212 master-0 kubenswrapper[4790]: W1011 10:38:43.951549 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 10:38:43.955212 master-0 kubenswrapper[4790]: W1011 10:38:43.951555 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 11 10:38:43.955212 master-0 kubenswrapper[4790]: W1011 10:38:43.951561 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 10:38:43.955212 master-0 kubenswrapper[4790]: W1011 10:38:43.951566 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 10:38:43.955212 master-0 kubenswrapper[4790]: I1011 10:38:43.952358 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 11 10:38:43.965414 master-0 kubenswrapper[4790]: I1011 10:38:43.965331 4790 server.go:491] "Kubelet version" kubeletVersion="v1.31.13"
Oct 11 10:38:43.965414 master-0 kubenswrapper[4790]: I1011 10:38:43.965397 4790 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 11 10:38:43.965622 master-0 kubenswrapper[4790]: W1011 10:38:43.965588 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 10:38:43.965622 master-0 kubenswrapper[4790]: W1011 10:38:43.965612 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 11 10:38:43.965622 master-0 kubenswrapper[4790]: W1011 10:38:43.965623 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965634 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965643 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965651 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965659 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965667 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965675 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965684 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965691 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965700 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965732 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965743 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 11 10:38:43.965700 master-0 kubenswrapper[4790]: W1011 10:38:43.965757 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965767 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965776 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965785 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965796 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965804 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965814 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965822 4790 feature_gate.go:330] unrecognized feature gate: Example
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965831 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965839 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965848 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965856 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965865 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965873 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965884 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965893 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965901 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965909 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965917 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 11 10:38:43.966046 master-0 kubenswrapper[4790]: W1011 10:38:43.965926 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965935 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965943 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965951 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965959 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965967 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965975 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965983 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.965993 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966002 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966011 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966020 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966027 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966035 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966042 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966053 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966062 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966069 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966077 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966085 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 11 10:38:43.966537 master-0 kubenswrapper[4790]: W1011 10:38:43.966093 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966103 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966110 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966119 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966126 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966134 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966142 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966150 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966157 4790 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966165 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966172 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966180 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966188 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966196 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966205 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966213 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966221 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966232 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 11 10:38:43.967197 master-0 kubenswrapper[4790]: W1011 10:38:43.966241 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 11 10:38:43.967628 master-0 kubenswrapper[4790]: I1011 10:38:43.966256 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 11 10:38:43.970260 master-0 kubenswrapper[4790]: W1011 10:38:43.970186 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 10:38:43.970260 master-0 kubenswrapper[4790]: W1011 10:38:43.970252 4790 feature_gate.go:330] unrecognized feature gate: Example
Oct 11 10:38:43.970260 master-0 kubenswrapper[4790]: W1011 10:38:43.970262 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970271 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970285 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970301 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970310 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970319 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970327 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970335 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970343 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970351 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970359 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970366 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970376 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970384 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 11 10:38:43.970371 master-0 kubenswrapper[4790]: W1011 10:38:43.970392 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970400 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970409 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970417 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970425 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970432 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970440 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970451 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970461 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970471 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970483 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970492 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970500 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970509 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970518 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970525 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970533 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970541 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970548 4790 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 10:38:43.970875 master-0 kubenswrapper[4790]: W1011 10:38:43.970558 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970568 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970578 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970587 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970595 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970603 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970611 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970618 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970626 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970634 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970641 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970650 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970658 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970666 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970674 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970682 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970690 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970698 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970734 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970743 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 11 10:38:43.971462 master-0 kubenswrapper[4790]: W1011 10:38:43.970751 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970760 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970768 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970775 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970783 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970794 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970804 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970813 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970822 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970830 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970837 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970846 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970853 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970861 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970868 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970876 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 10:38:43.971935 master-0 kubenswrapper[4790]: W1011 10:38:43.970884 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 10:38:43.972285 master-0 kubenswrapper[4790]: I1011 10:38:43.970898 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 11 10:38:43.972285 master-0 kubenswrapper[4790]: I1011 10:38:43.971240 4790 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 11 10:38:43.975142 master-0 kubenswrapper[4790]: I1011 10:38:43.975104 4790 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Oct 11 10:38:43.976780 master-0 kubenswrapper[4790]: I1011 10:38:43.976748 4790 server.go:997] "Starting client certificate rotation"
Oct 11 10:38:43.976814 master-0 kubenswrapper[4790]: I1011 10:38:43.976789 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 11 10:38:43.977041 master-0 kubenswrapper[4790]: I1011 10:38:43.976992 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Oct 11 10:38:44.004104 master-0 kubenswrapper[4790]: I1011 10:38:44.004001 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 11 10:38:44.006921 master-0 kubenswrapper[4790]: I1011 10:38:44.006830 4790 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 11 10:38:44.024389 master-0 kubenswrapper[4790]: I1011 10:38:44.024161 4790 log.go:25] "Validated CRI v1 runtime API"
Oct 11 10:38:44.031157 master-0 kubenswrapper[4790]: I1011 10:38:44.031103 4790 log.go:25] "Validated CRI v1 image API"
Oct 11 10:38:44.034209 master-0 kubenswrapper[4790]: I1011 10:38:44.034149 4790 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 11 10:38:44.040884 master-0 kubenswrapper[4790]: I1011 10:38:44.040820 4790 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 7ed13c62-ecfa-44fd-93db-2cdfc620f24a:/dev/vda3 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Oct 11 10:38:44.040992 master-0 kubenswrapper[4790]: I1011 10:38:44.040874 4790 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Oct 11 10:38:44.072217 master-0 kubenswrapper[4790]: I1011 10:38:44.072069 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Oct 11 10:38:44.081315 master-0 kubenswrapper[4790]: I1011 10:38:44.080831 4790 manager.go:217] Machine: {Timestamp:2025-10-11 10:38:44.078879165 +0000 UTC m=+0.633339537 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ef9a84db9c494bb2987b8bfc3ab9d214 SystemUUID:ef9a84db-9c49-4bb2-987b-8bfc3ab9d214 BootID:e436c54c-2677-4e8c-8717-eb2ef57d6e68 Filesystems:[{Device:/run
DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:05:d1:c8 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:3e:05:d1:c8 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:19:d3:60 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:cb:37:ae Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:72:f3:09:61:c2:18 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: 
DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 
Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 11 10:38:44.081315 master-0 kubenswrapper[4790]: I1011 10:38:44.081282 4790 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 11 10:38:44.081534 master-0 kubenswrapper[4790]: I1011 10:38:44.081486 4790 manager.go:233] Version: {KernelVersion:5.14.0-427.91.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202509241235-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 11 10:38:44.082009 master-0 kubenswrapper[4790]: I1011 10:38:44.081969 4790 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 11 10:38:44.082356 master-0 kubenswrapper[4790]: I1011 10:38:44.082288 4790 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 11 10:38:44.082680 master-0 kubenswrapper[4790]: I1011 10:38:44.082352 4790 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 11 10:38:44.083859 master-0 kubenswrapper[4790]: I1011 10:38:44.083823 4790 topology_manager.go:138] "Creating topology manager with none policy"
Oct 11 10:38:44.083859 master-0 kubenswrapper[4790]: I1011 10:38:44.083859 4790 container_manager_linux.go:303] "Creating device plugin manager"
Oct 11 10:38:44.083982 master-0 kubenswrapper[4790]: I1011 10:38:44.083885 4790 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 11 10:38:44.083982 master-0 kubenswrapper[4790]: I1011 10:38:44.083911 4790 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 11 10:38:44.085816 master-0 kubenswrapper[4790]: I1011 10:38:44.085777 4790 state_mem.go:36] "Initialized new in-memory state store"
Oct 11 10:38:44.085953 master-0 kubenswrapper[4790]: I1011 10:38:44.085921 4790 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 11 10:38:44.091314 master-0 kubenswrapper[4790]: I1011 10:38:44.091276 4790 kubelet.go:418] "Attempting to sync node with API server"
Oct 11 10:38:44.091314 master-0 kubenswrapper[4790]: I1011 10:38:44.091316 4790 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 11 10:38:44.091514 master-0 kubenswrapper[4790]: I1011 10:38:44.091482 4790 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 11 10:38:44.091514 master-0 kubenswrapper[4790]: I1011 10:38:44.091511 4790 kubelet.go:324] "Adding apiserver pod source"
Oct 11 10:38:44.091651 master-0 kubenswrapper[4790]: I1011 10:38:44.091533 4790 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 11 10:38:44.098532 master-0 kubenswrapper[4790]: I1011 10:38:44.098460 4790 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.12-3.rhaos4.18.gitdc59c78.el9" apiVersion="v1"
Oct 11 10:38:44.102667 master-0 kubenswrapper[4790]: I1011 10:38:44.102607 4790 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 11 10:38:44.103040 master-0 kubenswrapper[4790]: I1011 10:38:44.102967 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 11 10:38:44.103040 master-0 kubenswrapper[4790]: I1011 10:38:44.103017 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 11 10:38:44.103040 master-0 kubenswrapper[4790]: I1011 10:38:44.103030 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 11 10:38:44.103040 master-0 kubenswrapper[4790]: I1011 10:38:44.103040 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 11 10:38:44.103040 master-0 kubenswrapper[4790]: I1011 10:38:44.103055 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 11 10:38:44.103361 master-0 kubenswrapper[4790]: I1011 10:38:44.103067 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 11 10:38:44.103361 master-0 kubenswrapper[4790]: I1011 10:38:44.103077 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 11 10:38:44.103361 master-0 kubenswrapper[4790]: I1011 10:38:44.103094 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 11 10:38:44.103361 master-0 kubenswrapper[4790]: I1011 10:38:44.103105 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 11 10:38:44.103361 master-0 kubenswrapper[4790]: I1011 10:38:44.103115 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 11 10:38:44.103361 master-0 kubenswrapper[4790]: I1011 10:38:44.103132 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 11 10:38:44.103805 master-0 kubenswrapper[4790]: I1011 10:38:44.103765 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 11 10:38:44.105330 master-0 kubenswrapper[4790]: W1011 10:38:44.105260 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Oct 11 10:38:44.105486 master-0 kubenswrapper[4790]: W1011 10:38:44.105422 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Oct 11 10:38:44.105560 master-0 kubenswrapper[4790]: E1011 10:38:44.105465 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Oct 11 10:38:44.105560 master-0 kubenswrapper[4790]: E1011 10:38:44.105501 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Oct 11 10:38:44.105966 master-0 kubenswrapper[4790]: I1011 10:38:44.105922 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 11 10:38:44.106617 master-0 kubenswrapper[4790]: I1011 10:38:44.106580 4790 server.go:1280] "Started kubelet"
Oct 11 10:38:44.106860 master-0 kubenswrapper[4790]: I1011 10:38:44.106777 4790 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 11 10:38:44.107028 master-0 kubenswrapper[4790]: I1011 10:38:44.106923 4790 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 11 10:38:44.107104 master-0 kubenswrapper[4790]: I1011 10:38:44.107067 4790 server_v1.go:47] "podresources" method="list" useActivePods=true
Oct 11 10:38:44.107941 master-0 kubenswrapper[4790]: I1011 10:38:44.107882 4790 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 11 10:38:44.108449 master-0 systemd[1]: Started Kubernetes Kubelet.
Oct 11 10:38:44.112909 master-0 kubenswrapper[4790]: I1011 10:38:44.112855 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 11 10:38:44.112909 master-0 kubenswrapper[4790]: I1011 10:38:44.112904 4790 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 11 10:38:44.114095 master-0 kubenswrapper[4790]: E1011 10:38:44.113986 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Oct 11 10:38:44.115359 master-0 kubenswrapper[4790]: I1011 10:38:44.114592 4790 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 11 10:38:44.115359 master-0 kubenswrapper[4790]: I1011 10:38:44.114825 4790 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 11 10:38:44.115359 master-0 kubenswrapper[4790]: I1011 10:38:44.115368 4790 reconstruct.go:97] "Volume reconstruction finished"
Oct 11 10:38:44.115686 master-0 kubenswrapper[4790]: I1011 10:38:44.115411 4790 reconciler.go:26] "Reconciler: start to sync state"
Oct 11 10:38:44.116311 master-0 kubenswrapper[4790]: I1011 10:38:44.116121 4790 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Oct 11 10:38:44.117001 master-0 kubenswrapper[4790]: I1011 10:38:44.116927 4790 server.go:449] "Adding debug handlers to kubelet server"
Oct 11 10:38:44.119410 master-0 kubenswrapper[4790]: E1011 10:38:44.119299 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Oct 11 10:38:44.119821 master-0 kubenswrapper[4790]: I1011 10:38:44.119751 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Oct 11 10:38:44.120057 master-0 kubenswrapper[4790]: W1011 10:38:44.119843 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Oct 11 10:38:44.120345 master-0 kubenswrapper[4790]: E1011 10:38:44.120279 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Oct 11 10:38:44.121109 master-0 kubenswrapper[4790]: E1011 10:38:44.119946 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d6996696db8b7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.106541239 +0000 UTC m=+0.661001541,LastTimestamp:2025-10-11 10:38:44.106541239 +0000 UTC m=+0.661001541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Oct 11 10:38:44.126035 master-0 kubenswrapper[4790]: I1011 10:38:44.125986 4790 factory.go:55] Registering systemd factory
Oct 11 10:38:44.126035 master-0 kubenswrapper[4790]: I1011 10:38:44.126036 4790 factory.go:221] Registration of the systemd container factory successfully
Oct 11 10:38:44.127040 master-0 kubenswrapper[4790]: I1011 10:38:44.126995 4790 factory.go:153] Registering CRI-O factory
Oct 11 10:38:44.127040 master-0 kubenswrapper[4790]: I1011 10:38:44.127031 4790 factory.go:221] Registration of the crio container factory successfully
Oct 11 10:38:44.127213 master-0 kubenswrapper[4790]: I1011 10:38:44.127138 4790 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 11 10:38:44.127213 master-0 kubenswrapper[4790]: I1011 10:38:44.127187 4790 factory.go:103] Registering Raw factory
Oct 11 10:38:44.127213 master-0 kubenswrapper[4790]: I1011 10:38:44.127213 4790 manager.go:1196] Started watching for new ooms in manager
Oct 11 10:38:44.128338 master-0 kubenswrapper[4790]: I1011 10:38:44.128281 4790 manager.go:319] Starting recovery of all containers
Oct 11 10:38:44.136531 master-0 kubenswrapper[4790]: E1011 10:38:44.136478 4790 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Oct 11 10:38:44.154937 master-0 kubenswrapper[4790]: I1011 10:38:44.154890 4790 manager.go:324] Recovery completed
Oct 11 10:38:44.167532 master-0 kubenswrapper[4790]: I1011 10:38:44.167493 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 11 10:38:44.169143 master-0 kubenswrapper[4790]: I1011 10:38:44.169081 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Oct 11 10:38:44.169143 master-0 kubenswrapper[4790]: I1011 10:38:44.169133 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Oct 11 10:38:44.169143 master-0 kubenswrapper[4790]: I1011 10:38:44.169147 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Oct 11 10:38:44.171274 master-0 kubenswrapper[4790]: I1011 10:38:44.171229 4790 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 11 10:38:44.171274 master-0 kubenswrapper[4790]: I1011 10:38:44.171252 4790 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 11 10:38:44.171274 master-0 kubenswrapper[4790]: I1011 10:38:44.171278 4790 state_mem.go:36] "Initialized new in-memory state store"
Oct 11 10:38:44.172364 master-0 kubenswrapper[4790]: E1011 10:38:44.172081 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Oct 11 10:38:44.175308 master-0 kubenswrapper[4790]: I1011 10:38:44.175272 4790 policy_none.go:49] "None policy: Start"
Oct 11 10:38:44.176929 master-0 kubenswrapper[4790]: I1011 10:38:44.176860 4790 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 11 10:38:44.177187 master-0 kubenswrapper[4790]: I1011 10:38:44.177014 4790 state_mem.go:35] "Initializing new in-memory state store"
Oct 11 10:38:44.180090 master-0 kubenswrapper[4790]: E1011 10:38:44.179915 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Oct 11 10:38:44.188854 master-0 kubenswrapper[4790]: E1011 10:38:44.188755 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d291d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,LastTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Oct 11 10:38:44.214327 master-0 kubenswrapper[4790]: E1011 10:38:44.214232 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Oct 11 10:38:44.251483 master-0 kubenswrapper[4790]: I1011 10:38:44.251320 4790 manager.go:334] "Starting Device Plugin manager"
Oct 11 10:38:44.251781 master-0 kubenswrapper[4790]: I1011 10:38:44.251493 4790 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 11 10:38:44.251781 master-0 kubenswrapper[4790]: I1011 10:38:44.251517 4790 server.go:79] "Starting device plugin registration server"
Oct 11 10:38:44.252197 master-0 kubenswrapper[4790]: I1011 10:38:44.252150 4790 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 11 10:38:44.252287 master-0 kubenswrapper[4790]: I1011 10:38:44.252180 4790 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 11 10:38:44.253074 master-0 kubenswrapper[4790]: I1011 10:38:44.252995 4790 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 11 10:38:44.253387 master-0 kubenswrapper[4790]: I1011 10:38:44.253172 4790 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 11 10:38:44.253387 master-0 kubenswrapper[4790]: I1011 10:38:44.253188 4790 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 11 10:38:44.255114 master-0 kubenswrapper[4790]: E1011 10:38:44.254955 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Oct 11 10:38:44.267802 master-0 kubenswrapper[4790]: E1011 10:38:44.267555 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69967266ce36 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.257082934 +0000 UTC m=+0.811543266,LastTimestamp:2025-10-11 10:38:44.257082934 +0000 UTC m=+0.811543266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Oct 11 10:38:44.288145 master-0 kubenswrapper[4790]: I1011 10:38:44.287995 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 11 10:38:44.291128 master-0 kubenswrapper[4790]: I1011 10:38:44.291071 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 11 10:38:44.291240 master-0 kubenswrapper[4790]: I1011 10:38:44.291155 4790 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 11 10:38:44.291240 master-0 kubenswrapper[4790]: I1011 10:38:44.291198 4790 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 11 10:38:44.291362 master-0 kubenswrapper[4790]: E1011 10:38:44.291278 4790 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Oct 11 10:38:44.301419 master-0 kubenswrapper[4790]: W1011 10:38:44.301330 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Oct 11 10:38:44.301580 master-0 kubenswrapper[4790]: E1011 10:38:44.301429 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Oct 11 10:38:44.328186 master-0 kubenswrapper[4790]: E1011 10:38:44.328052 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="400ms"
Oct 11 10:38:44.353370 master-0 kubenswrapper[4790]: I1011 10:38:44.353279 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 11 10:38:44.355647 master-0 kubenswrapper[4790]: I1011 10:38:44.355595 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Oct 11 10:38:44.355647 master-0 kubenswrapper[4790]: I1011 10:38:44.355651 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Oct 11 10:38:44.355870 master-0 kubenswrapper[4790]: I1011 10:38:44.355663 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Oct 11 10:38:44.355870 master-0 kubenswrapper[4790]: I1011 10:38:44.355723 4790 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Oct 11 10:38:44.363626 master-0 kubenswrapper[4790]: E1011 10:38:44.363569 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Oct 11 10:38:44.364027 master-0 kubenswrapper[4790]: E1011 10:38:44.363887 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28a0cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:44.355632298 +0000 UTC m=+0.910092590,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Oct 11 10:38:44.374550 master-0 kubenswrapper[4790]: E1011 10:38:44.374320 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28eb17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in
the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:44.355658363 +0000 UTC m=+0.910118655,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.383372 master-0 kubenswrapper[4790]: E1011 10:38:44.383083 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d291d68\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d291d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,LastTimestamp:2025-10-11 10:38:44.355668436 +0000 UTC m=+0.910128728,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.392214 master-0 kubenswrapper[4790]: I1011 10:38:44.392106 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Oct 11 10:38:44.392382 master-0 kubenswrapper[4790]: I1011 10:38:44.392249 4790 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 11 10:38:44.394815 master-0 kubenswrapper[4790]: I1011 10:38:44.394765 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:44.394957 master-0 kubenswrapper[4790]: I1011 10:38:44.394830 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:44.394957 master-0 kubenswrapper[4790]: I1011 10:38:44.394849 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:44.395207 master-0 kubenswrapper[4790]: I1011 10:38:44.395162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.395302 master-0 kubenswrapper[4790]: I1011 10:38:44.395218 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:44.396911 master-0 kubenswrapper[4790]: I1011 10:38:44.396858 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:44.397030 master-0 kubenswrapper[4790]: I1011 10:38:44.396925 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:44.397030 master-0 kubenswrapper[4790]: I1011 10:38:44.396947 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:44.406115 master-0 kubenswrapper[4790]: E1011 10:38:44.405895 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28a0cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:44.394806181 +0000 UTC m=+0.949266503,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.413640 master-0 kubenswrapper[4790]: E1011 10:38:44.413427 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28eb17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:44.394842059 +0000 UTC m=+0.949302381,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.424564 master-0 kubenswrapper[4790]: E1011 10:38:44.424402 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d291d68\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d291d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,LastTimestamp:2025-10-11 10:38:44.394858642 +0000 UTC m=+0.949318964,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.431455 master-0 kubenswrapper[4790]: E1011 10:38:44.431234 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28a0cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:44.396899262 +0000 UTC m=+0.951359584,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.442489 master-0 kubenswrapper[4790]: E1011 10:38:44.442349 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28eb17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:44.396938342 +0000 UTC m=+0.951398674,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.450291 master-0 kubenswrapper[4790]: E1011 10:38:44.450103 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d291d68\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d291d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,LastTimestamp:2025-10-11 10:38:44.396957846 +0000 UTC m=+0.951418168,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.517529 master-0 kubenswrapper[4790]: I1011 10:38:44.517403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/117b8efe269c98124cf5022ab3c340a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"117b8efe269c98124cf5022ab3c340a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.517529 master-0 
kubenswrapper[4790]: I1011 10:38:44.517489 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/117b8efe269c98124cf5022ab3c340a5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"117b8efe269c98124cf5022ab3c340a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.564797 master-0 kubenswrapper[4790]: I1011 10:38:44.564661 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:44.566686 master-0 kubenswrapper[4790]: I1011 10:38:44.566578 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:44.566847 master-0 kubenswrapper[4790]: I1011 10:38:44.566801 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:44.566847 master-0 kubenswrapper[4790]: I1011 10:38:44.566835 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:44.566980 master-0 kubenswrapper[4790]: I1011 10:38:44.566902 4790 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Oct 11 10:38:44.575560 master-0 kubenswrapper[4790]: E1011 10:38:44.575487 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Oct 11 10:38:44.575818 master-0 kubenswrapper[4790]: E1011 10:38:44.575615 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28a0cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:44.566662264 +0000 UTC m=+1.121122636,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.586389 master-0 kubenswrapper[4790]: E1011 10:38:44.586171 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28eb17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:44.566824158 +0000 UTC m=+1.121284490,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.594748 master-0 kubenswrapper[4790]: E1011 10:38:44.594546 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d291d68\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d291d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,LastTimestamp:2025-10-11 10:38:44.566849374 +0000 UTC m=+1.121309706,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.618821 master-0 kubenswrapper[4790]: I1011 10:38:44.618682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/117b8efe269c98124cf5022ab3c340a5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"117b8efe269c98124cf5022ab3c340a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.618821 master-0 kubenswrapper[4790]: I1011 10:38:44.618788 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/117b8efe269c98124cf5022ab3c340a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"117b8efe269c98124cf5022ab3c340a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.619101 master-0 kubenswrapper[4790]: I1011 10:38:44.618886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/117b8efe269c98124cf5022ab3c340a5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"117b8efe269c98124cf5022ab3c340a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.619101 master-0 kubenswrapper[4790]: I1011 10:38:44.618964 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/117b8efe269c98124cf5022ab3c340a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"117b8efe269c98124cf5022ab3c340a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.731055 master-0 kubenswrapper[4790]: I1011 10:38:44.730792 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Oct 11 10:38:44.738774 master-0 kubenswrapper[4790]: E1011 10:38:44.738651 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="800ms" Oct 11 10:38:44.976239 master-0 kubenswrapper[4790]: I1011 10:38:44.976085 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:44.978993 master-0 kubenswrapper[4790]: I1011 10:38:44.978942 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:44.979106 master-0 kubenswrapper[4790]: I1011 10:38:44.979089 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:44.979221 master-0 kubenswrapper[4790]: I1011 10:38:44.979189 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:44.979292 master-0 kubenswrapper[4790]: I1011 10:38:44.979257 4790 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Oct 11 10:38:44.989581 master-0 kubenswrapper[4790]: E1011 10:38:44.989444 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Oct 11 
10:38:44.990659 master-0 kubenswrapper[4790]: E1011 10:38:44.990478 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28a0cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:44.979074842 +0000 UTC m=+1.533535154,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:44.999132 master-0 kubenswrapper[4790]: E1011 10:38:44.998996 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28eb17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:44.979140695 +0000 UTC m=+1.533601007,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:45.007750 master-0 kubenswrapper[4790]: E1011 10:38:45.007542 
4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d291d68\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d291d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,LastTimestamp:2025-10-11 10:38:44.97920904 +0000 UTC m=+1.533669342,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:45.128686 master-0 kubenswrapper[4790]: I1011 10:38:45.128538 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 11 10:38:45.145394 master-0 kubenswrapper[4790]: W1011 10:38:45.145297 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Oct 11 10:38:45.145394 master-0 kubenswrapper[4790]: E1011 10:38:45.145376 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 11 10:38:45.420764 master-0 kubenswrapper[4790]: W1011 10:38:45.420356 4790 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod117b8efe269c98124cf5022ab3c340a5.slice/crio-adfad26afad6055735ede8e6a1ce844b2d8bb91640a5c927f437898a49285a03 WatchSource:0}: Error finding container adfad26afad6055735ede8e6a1ce844b2d8bb91640a5c927f437898a49285a03: Status 404 returned error can't find the container with id adfad26afad6055735ede8e6a1ce844b2d8bb91640a5c927f437898a49285a03 Oct 11 10:38:45.431340 master-0 kubenswrapper[4790]: I1011 10:38:45.431290 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 10:38:45.440112 master-0 kubenswrapper[4790]: E1011 10:38:45.439963 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.186d6996b862ee69 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:117b8efe269c98124cf5022ab3c340a5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f22b65e5c744a32d3955dd7c36d809e3114a8aa501b44c00330dfda886c21169\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:45.431234153 +0000 UTC m=+1.985694455,LastTimestamp:2025-10-11 10:38:45.431234153 +0000 UTC m=+1.985694455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:45.463213 master-0 kubenswrapper[4790]: W1011 10:38:45.463125 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" 
cannot list resource "services" in API group "" at the cluster scope Oct 11 10:38:45.463213 master-0 kubenswrapper[4790]: E1011 10:38:45.463209 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 11 10:38:45.547473 master-0 kubenswrapper[4790]: E1011 10:38:45.547331 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="1.6s" Oct 11 10:38:45.547473 master-0 kubenswrapper[4790]: W1011 10:38:45.547458 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Oct 11 10:38:45.547860 master-0 kubenswrapper[4790]: E1011 10:38:45.547516 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 11 10:38:45.675247 master-0 kubenswrapper[4790]: W1011 10:38:45.675038 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Oct 11 10:38:45.675247 master-0 kubenswrapper[4790]: E1011 10:38:45.675135 4790 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 11 10:38:45.789988 master-0 kubenswrapper[4790]: I1011 10:38:45.789756 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:45.791875 master-0 kubenswrapper[4790]: I1011 10:38:45.791814 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:45.791875 master-0 kubenswrapper[4790]: I1011 10:38:45.791872 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:45.791875 master-0 kubenswrapper[4790]: I1011 10:38:45.791886 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:45.792205 master-0 kubenswrapper[4790]: I1011 10:38:45.791937 4790 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Oct 11 10:38:45.799776 master-0 kubenswrapper[4790]: E1011 10:38:45.799654 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Oct 11 10:38:45.799982 master-0 kubenswrapper[4790]: E1011 10:38:45.799789 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28a0cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:45.791853047 +0000 UTC m=+2.346313349,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:45.804938 master-0 kubenswrapper[4790]: E1011 10:38:45.804365 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28eb17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:45.791881578 +0000 UTC m=+2.346341880,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:45.809641 master-0 kubenswrapper[4790]: E1011 10:38:45.809450 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d291d68\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d291d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169153896 +0000 UTC m=+0.723614198,LastTimestamp:2025-10-11 10:38:45.791894568 +0000 UTC m=+2.346354870,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:46.125306 master-0 kubenswrapper[4790]: I1011 10:38:46.125211 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 11 10:38:46.300079 master-0 kubenswrapper[4790]: I1011 10:38:46.299880 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"117b8efe269c98124cf5022ab3c340a5","Type":"ContainerStarted","Data":"adfad26afad6055735ede8e6a1ce844b2d8bb91640a5c927f437898a49285a03"} Oct 11 10:38:47.131021 master-0 kubenswrapper[4790]: I1011 10:38:47.130849 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 11 10:38:47.158044 master-0 kubenswrapper[4790]: E1011 10:38:47.157930 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="3.2s" Oct 11 10:38:47.401034 master-0 kubenswrapper[4790]: I1011 10:38:47.400619 4790 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:47.402439 master-0 kubenswrapper[4790]: I1011 10:38:47.402392 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:47.402589 master-0 kubenswrapper[4790]: I1011 10:38:47.402459 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:47.402589 master-0 kubenswrapper[4790]: I1011 10:38:47.402482 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:47.402589 master-0 kubenswrapper[4790]: I1011 10:38:47.402538 4790 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Oct 11 10:38:47.412051 master-0 kubenswrapper[4790]: E1011 10:38:47.411903 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28a0cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28a0cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169121995 +0000 UTC m=+0.723582297,LastTimestamp:2025-10-11 10:38:47.402433609 +0000 UTC m=+3.956893931,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:47.412405 master-0 kubenswrapper[4790]: E1011 10:38:47.412342 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API 
group \"\" at the cluster scope" node="master-0" Oct 11 10:38:47.421535 master-0 kubenswrapper[4790]: E1011 10:38:47.421455 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.186d69966d28eb17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.186d69966d28eb17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-10-11 10:38:44.169141015 +0000 UTC m=+0.723601317,LastTimestamp:2025-10-11 10:38:47.4024716 +0000 UTC m=+3.956931922,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Oct 11 10:38:47.611938 master-0 kubenswrapper[4790]: W1011 10:38:47.611822 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Oct 11 10:38:47.611938 master-0 kubenswrapper[4790]: E1011 10:38:47.611909 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 11 10:38:47.765596 master-0 kubenswrapper[4790]: I1011 10:38:47.765350 4790 csr.go:261] certificate signing request csr-bl7b7 is approved, waiting to be issued Oct 11 10:38:47.809323 master-0 kubenswrapper[4790]: W1011 10:38:47.809236 4790 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Oct 11 10:38:47.809323 master-0 kubenswrapper[4790]: E1011 10:38:47.809311 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 11 10:38:47.834906 master-0 kubenswrapper[4790]: I1011 10:38:47.834793 4790 csr.go:257] certificate signing request csr-bl7b7 is issued Oct 11 10:38:47.978499 master-0 kubenswrapper[4790]: I1011 10:38:47.978321 4790 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Oct 11 10:38:48.147455 master-0 kubenswrapper[4790]: I1011 10:38:48.147328 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:48.167177 master-0 kubenswrapper[4790]: I1011 10:38:48.167070 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:48.363562 master-0 kubenswrapper[4790]: I1011 10:38:48.363480 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:48.386506 master-0 kubenswrapper[4790]: I1011 10:38:48.386446 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 11 10:38:48.429863 master-0 kubenswrapper[4790]: I1011 10:38:48.429606 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 11 10:38:48.692285 master-0 kubenswrapper[4790]: I1011 10:38:48.692075 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:48.692285 master-0 kubenswrapper[4790]: E1011 10:38:48.692139 4790 csi_plugin.go:305] Failed to initialize 
CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Oct 11 10:38:48.720533 master-0 kubenswrapper[4790]: I1011 10:38:48.720445 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:48.741284 master-0 kubenswrapper[4790]: I1011 10:38:48.741193 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:48.804757 master-0 kubenswrapper[4790]: I1011 10:38:48.804616 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:48.837642 master-0 kubenswrapper[4790]: I1011 10:38:48.837535 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-10-12 10:21:12 +0000 UTC, rotation deadline is 2025-10-12 07:14:08.069129423 +0000 UTC Oct 11 10:38:48.837642 master-0 kubenswrapper[4790]: I1011 10:38:48.837604 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h35m19.231532581s for next certificate rotation Oct 11 10:38:49.066651 master-0 kubenswrapper[4790]: I1011 10:38:49.066582 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:49.066651 master-0 kubenswrapper[4790]: E1011 10:38:49.066624 4790 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Oct 11 10:38:49.094995 master-0 kubenswrapper[4790]: I1011 10:38:49.094925 4790 apiserver.go:52] "Watching apiserver" Oct 11 10:38:49.097767 master-0 kubenswrapper[4790]: I1011 10:38:49.097576 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 11 10:38:49.098050 master-0 kubenswrapper[4790]: I1011 10:38:49.097804 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=[] Oct 11 10:38:49.116925 master-0 kubenswrapper[4790]: I1011 10:38:49.116839 4790 
desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Oct 11 10:38:49.173064 master-0 kubenswrapper[4790]: I1011 10:38:49.172999 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:49.194382 master-0 kubenswrapper[4790]: I1011 10:38:49.194293 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:49.257359 master-0 kubenswrapper[4790]: I1011 10:38:49.257201 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:49.307842 master-0 kubenswrapper[4790]: I1011 10:38:49.307776 4790 generic.go:334] "Generic (PLEG): container finished" podID="117b8efe269c98124cf5022ab3c340a5" containerID="2773ca4b00e741b3dd67df737e99e7af029b77cdda7febc0ccb0b23ed8efcf99" exitCode=0 Oct 11 10:38:49.307842 master-0 kubenswrapper[4790]: I1011 10:38:49.307826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"117b8efe269c98124cf5022ab3c340a5","Type":"ContainerDied","Data":"2773ca4b00e741b3dd67df737e99e7af029b77cdda7febc0ccb0b23ed8efcf99"} Oct 11 10:38:49.308117 master-0 kubenswrapper[4790]: I1011 10:38:49.307999 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:49.309332 master-0 kubenswrapper[4790]: I1011 10:38:49.309280 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:49.309467 master-0 kubenswrapper[4790]: I1011 10:38:49.309351 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:49.309467 master-0 kubenswrapper[4790]: I1011 10:38:49.309377 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:49.537801 master-0 kubenswrapper[4790]: I1011 10:38:49.537748 
4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:49.537801 master-0 kubenswrapper[4790]: E1011 10:38:49.537802 4790 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Oct 11 10:38:50.098186 master-0 kubenswrapper[4790]: I1011 10:38:50.098121 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:50.117732 master-0 kubenswrapper[4790]: I1011 10:38:50.117667 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:50.180030 master-0 kubenswrapper[4790]: I1011 10:38:50.179968 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:50.312755 master-0 kubenswrapper[4790]: I1011 10:38:50.312613 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_117b8efe269c98124cf5022ab3c340a5/kube-rbac-proxy-crio/0.log" Oct 11 10:38:50.313641 master-0 kubenswrapper[4790]: I1011 10:38:50.313378 4790 generic.go:334] "Generic (PLEG): container finished" podID="117b8efe269c98124cf5022ab3c340a5" containerID="7c490e79b5a38a970413fe2dad2770b3c117205fbbd292831a37ed9dbecf228f" exitCode=1 Oct 11 10:38:50.313641 master-0 kubenswrapper[4790]: I1011 10:38:50.313476 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:50.313641 master-0 kubenswrapper[4790]: I1011 10:38:50.313471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"117b8efe269c98124cf5022ab3c340a5","Type":"ContainerDied","Data":"7c490e79b5a38a970413fe2dad2770b3c117205fbbd292831a37ed9dbecf228f"} Oct 11 10:38:50.315034 master-0 kubenswrapper[4790]: I1011 10:38:50.314889 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:50.315034 master-0 kubenswrapper[4790]: I1011 10:38:50.314961 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:50.315034 master-0 kubenswrapper[4790]: I1011 10:38:50.314975 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:50.329833 master-0 kubenswrapper[4790]: I1011 10:38:50.329656 4790 scope.go:117] "RemoveContainer" containerID="7c490e79b5a38a970413fe2dad2770b3c117205fbbd292831a37ed9dbecf228f" Oct 11 10:38:50.369858 master-0 kubenswrapper[4790]: E1011 10:38:50.369784 4790 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Oct 11 10:38:50.461269 master-0 kubenswrapper[4790]: I1011 10:38:50.461195 4790 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Oct 11 10:38:50.461269 master-0 kubenswrapper[4790]: E1011 10:38:50.461237 4790 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Oct 11 10:38:50.613306 master-0 kubenswrapper[4790]: I1011 10:38:50.613079 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 10:38:50.614897 master-0 kubenswrapper[4790]: I1011 10:38:50.614841 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Oct 11 10:38:50.614984 master-0 kubenswrapper[4790]: I1011 10:38:50.614910 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Oct 11 10:38:50.614984 master-0 kubenswrapper[4790]: I1011 10:38:50.614931 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Oct 11 10:38:50.615093 
master-0 kubenswrapper[4790]: I1011 10:38:50.615000 4790 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Oct 11 10:38:50.705684 master-0 kubenswrapper[4790]: I1011 10:38:50.705581 4790 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Oct 11 10:38:51.033322 master-0 kubenswrapper[4790]: I1011 10:38:51.033113 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 11 10:38:51.124014 master-0 kubenswrapper[4790]: I1011 10:38:51.123930 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Oct 11 10:38:51.155195 master-0 kubenswrapper[4790]: I1011 10:38:51.155150 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Oct 11 10:38:51.177517 master-0 kubenswrapper[4790]: I1011 10:38:51.177444 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43270: no serving certificate available for the kubelet" Oct 11 10:38:51.269817 master-0 kubenswrapper[4790]: I1011 10:38:51.269780 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43286: no serving certificate available for the kubelet" Oct 11 10:38:51.318627 master-0 kubenswrapper[4790]: I1011 10:38:51.318561 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_117b8efe269c98124cf5022ab3c340a5/kube-rbac-proxy-crio/1.log" Oct 11 10:38:51.319207 master-0 kubenswrapper[4790]: I1011 10:38:51.319169 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_117b8efe269c98124cf5022ab3c340a5/kube-rbac-proxy-crio/0.log" Oct 11 10:38:51.319728 master-0 kubenswrapper[4790]: I1011 10:38:51.319653 4790 generic.go:334] "Generic (PLEG): container finished" podID="117b8efe269c98124cf5022ab3c340a5" containerID="0dce5d6b10f720448939fdc7754f0953d8f5becf80952889b528531812732245" 
exitCode=1 Oct 11 10:38:51.319792 master-0 kubenswrapper[4790]: I1011 10:38:51.319756 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"117b8efe269c98124cf5022ab3c340a5","Type":"ContainerDied","Data":"0dce5d6b10f720448939fdc7754f0953d8f5becf80952889b528531812732245"} Oct 11 10:38:51.319890 master-0 kubenswrapper[4790]: I1011 10:38:51.319856 4790 scope.go:117] "RemoveContainer" containerID="7c490e79b5a38a970413fe2dad2770b3c117205fbbd292831a37ed9dbecf228f" Oct 11 10:38:51.349524 master-0 kubenswrapper[4790]: I1011 10:38:51.349441 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Oct 11 10:38:51.349809 master-0 kubenswrapper[4790]: I1011 10:38:51.349746 4790 scope.go:117] "RemoveContainer" containerID="0dce5d6b10f720448939fdc7754f0953d8f5becf80952889b528531812732245" Oct 11 10:38:51.350044 master-0 kubenswrapper[4790]: E1011 10:38:51.349992 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(117b8efe269c98124cf5022ab3c340a5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="117b8efe269c98124cf5022ab3c340a5" Oct 11 10:38:51.371527 master-0 kubenswrapper[4790]: I1011 10:38:51.371411 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43288: no serving certificate available for the kubelet" Oct 11 10:38:51.460458 master-0 kubenswrapper[4790]: I1011 10:38:51.460356 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43290: no serving certificate available for the kubelet" Oct 11 10:38:51.529090 master-0 kubenswrapper[4790]: I1011 10:38:51.529023 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43296: no serving certificate available for the 
kubelet" Oct 11 10:38:51.641655 master-0 kubenswrapper[4790]: I1011 10:38:51.641390 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43304: no serving certificate available for the kubelet" Oct 11 10:38:51.837006 master-0 kubenswrapper[4790]: I1011 10:38:51.836903 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43318: no serving certificate available for the kubelet" Oct 11 10:38:52.153661 master-0 kubenswrapper[4790]: I1011 10:38:52.153548 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 11 10:38:52.184540 master-0 kubenswrapper[4790]: I1011 10:38:52.184451 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43322: no serving certificate available for the kubelet" Oct 11 10:38:52.326032 master-0 kubenswrapper[4790]: I1011 10:38:52.325946 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_117b8efe269c98124cf5022ab3c340a5/kube-rbac-proxy-crio/1.log" Oct 11 10:38:52.327639 master-0 kubenswrapper[4790]: I1011 10:38:52.327572 4790 scope.go:117] "RemoveContainer" containerID="0dce5d6b10f720448939fdc7754f0953d8f5becf80952889b528531812732245" Oct 11 10:38:52.327983 master-0 kubenswrapper[4790]: E1011 10:38:52.327926 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(117b8efe269c98124cf5022ab3c340a5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="117b8efe269c98124cf5022ab3c340a5" Oct 11 10:38:52.863803 master-0 kubenswrapper[4790]: I1011 10:38:52.863667 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43330: no serving certificate available for the kubelet" Oct 11 10:38:54.166295 master-0 kubenswrapper[4790]: I1011 10:38:54.166187 4790 ???:1] "http: TLS 
handshake error from 192.168.34.11:44580: no serving certificate available for the kubelet" Oct 11 10:38:54.182147 master-0 kubenswrapper[4790]: I1011 10:38:54.182079 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43336: no serving certificate available for the kubelet" Oct 11 10:38:56.468154 master-0 kubenswrapper[4790]: I1011 10:38:56.468048 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-5-master-0"] Oct 11 10:38:56.469015 master-0 kubenswrapper[4790]: I1011 10:38:56.468384 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.469015 master-0 kubenswrapper[4790]: E1011 10:38:56.468518 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-5-master-0" podUID="795a4c8d-2d06-412c-a788-7e8585d432f7" Oct 11 10:38:56.597612 master-0 kubenswrapper[4790]: I1011 10:38:56.597512 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.597612 master-0 kubenswrapper[4790]: I1011 10:38:56.597593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-var-lock\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.597612 master-0 kubenswrapper[4790]: I1011 10:38:56.597613 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.698004 master-0 kubenswrapper[4790]: I1011 10:38:56.697907 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.698004 master-0 kubenswrapper[4790]: I1011 10:38:56.697989 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-var-lock\") pod \"installer-5-master-0\" (UID: 
\"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.698004 master-0 kubenswrapper[4790]: I1011 10:38:56.698011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.698325 master-0 kubenswrapper[4790]: I1011 10:38:56.698163 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-var-lock\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.698325 master-0 kubenswrapper[4790]: I1011 10:38:56.698167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:56.729323 master-0 kubenswrapper[4790]: E1011 10:38:56.729213 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:56.729323 master-0 kubenswrapper[4790]: E1011 10:38:56.729263 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-5-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:56.729415 master-0 kubenswrapper[4790]: E1011 10:38:56.729348 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access podName:795a4c8d-2d06-412c-a788-7e8585d432f7 nodeName:}" failed. 
No retries permitted until 2025-10-11 10:38:57.229318496 +0000 UTC m=+13.783778788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access") pod "installer-5-master-0" (UID: "795a4c8d-2d06-412c-a788-7e8585d432f7") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:56.783013 master-0 kubenswrapper[4790]: I1011 10:38:56.782952 4790 ???:1] "http: TLS handshake error from 192.168.34.12:43346: no serving certificate available for the kubelet" Oct 11 10:38:57.303540 master-0 kubenswrapper[4790]: I1011 10:38:57.303460 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:57.303839 master-0 kubenswrapper[4790]: E1011 10:38:57.303683 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:57.303839 master-0 kubenswrapper[4790]: E1011 10:38:57.303724 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-5-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:57.303839 master-0 kubenswrapper[4790]: E1011 10:38:57.303787 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access podName:795a4c8d-2d06-412c-a788-7e8585d432f7 nodeName:}" failed. No retries permitted until 2025-10-11 10:38:58.303767714 +0000 UTC m=+14.858228006 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access") pod "installer-5-master-0" (UID: "795a4c8d-2d06-412c-a788-7e8585d432f7") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:58.292135 master-0 kubenswrapper[4790]: I1011 10:38:58.292075 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:58.292769 master-0 kubenswrapper[4790]: E1011 10:38:58.292208 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-5-master-0" podUID="795a4c8d-2d06-412c-a788-7e8585d432f7" Oct 11 10:38:58.310905 master-0 kubenswrapper[4790]: I1011 10:38:58.310835 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:38:58.311240 master-0 kubenswrapper[4790]: E1011 10:38:58.310985 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:58.311240 master-0 kubenswrapper[4790]: E1011 10:38:58.311007 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-5-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:38:58.311240 master-0 kubenswrapper[4790]: E1011 10:38:58.311051 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access podName:795a4c8d-2d06-412c-a788-7e8585d432f7 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:00.311032893 +0000 UTC m=+16.865493195 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access") pod "installer-5-master-0" (UID: "795a4c8d-2d06-412c-a788-7e8585d432f7") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:00.292680 master-0 kubenswrapper[4790]: I1011 10:39:00.292549 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:39:00.293747 master-0 kubenswrapper[4790]: E1011 10:39:00.292882 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-5-master-0" podUID="795a4c8d-2d06-412c-a788-7e8585d432f7"
Oct 11 10:39:00.324948 master-0 kubenswrapper[4790]: I1011 10:39:00.324783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0"
Oct 11 10:39:00.325297 master-0 kubenswrapper[4790]: E1011 10:39:00.325114 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:00.325297 master-0 kubenswrapper[4790]: E1011 10:39:00.325160 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-5-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:00.325297 master-0 kubenswrapper[4790]: E1011 10:39:00.325260 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access podName:795a4c8d-2d06-412c-a788-7e8585d432f7 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:04.325229503 +0000 UTC m=+20.879689835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access") pod "installer-5-master-0" (UID: "795a4c8d-2d06-412c-a788-7e8585d432f7") : object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:00.542194 master-0 kubenswrapper[4790]: I1011 10:39:00.542070 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5kghv"]
Oct 11 10:39:00.543990 master-0 kubenswrapper[4790]: I1011 10:39:00.542546 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-85bvx"]
Oct 11 10:39:00.543990 master-0 kubenswrapper[4790]: I1011 10:39:00.542687 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8lkdg"]
Oct 11 10:39:00.543990 master-0 kubenswrapper[4790]: I1011 10:39:00.542813 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5kghv"
Oct 11 10:39:00.543990 master-0 kubenswrapper[4790]: I1011 10:39:00.543114 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.543990 master-0 kubenswrapper[4790]: I1011 10:39:00.543841 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.547202 master-0 kubenswrapper[4790]: I1011 10:39:00.547166 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Oct 11 10:39:00.547304 master-0 kubenswrapper[4790]: I1011 10:39:00.547245 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-c7nlq"
Oct 11 10:39:00.547375 master-0 kubenswrapper[4790]: I1011 10:39:00.547277 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Oct 11 10:39:00.547514 master-0 kubenswrapper[4790]: I1011 10:39:00.547465 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Oct 11 10:39:00.547631 master-0 kubenswrapper[4790]: I1011 10:39:00.547601 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Oct 11 10:39:00.547874 master-0 kubenswrapper[4790]: I1011 10:39:00.547837 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Oct 11 10:39:00.547950 master-0 kubenswrapper[4790]: I1011 10:39:00.547843 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Oct 11 10:39:00.548213 master-0 kubenswrapper[4790]: I1011 10:39:00.548177 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Oct 11 10:39:00.548386 master-0 kubenswrapper[4790]: I1011 10:39:00.548351 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Oct 11 10:39:00.548386 master-0 kubenswrapper[4790]: I1011 10:39:00.548362 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-r499q"]
Oct 11 10:39:00.548591 master-0 kubenswrapper[4790]: I1011 10:39:00.548551 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-hlrjh"
Oct 11 10:39:00.548756 master-0 kubenswrapper[4790]: I1011 10:39:00.548644 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.551401 master-0 kubenswrapper[4790]: I1011 10:39:00.551360 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ft6fv"]
Oct 11 10:39:00.551859 master-0 kubenswrapper[4790]: I1011 10:39:00.551788 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.552308 master-0 kubenswrapper[4790]: I1011 10:39:00.552269 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-5ddj4"
Oct 11 10:39:00.552366 master-0 kubenswrapper[4790]: I1011 10:39:00.552352 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Oct 11 10:39:00.552731 master-0 kubenswrapper[4790]: I1011 10:39:00.552664 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Oct 11 10:39:00.552821 master-0 kubenswrapper[4790]: I1011 10:39:00.552788 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-kh4ld"]
Oct 11 10:39:00.553203 master-0 kubenswrapper[4790]: I1011 10:39:00.553178 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.553768 master-0 kubenswrapper[4790]: I1011 10:39:00.553579 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Oct 11 10:39:00.554113 master-0 kubenswrapper[4790]: I1011 10:39:00.554026 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Oct 11 10:39:00.554588 master-0 kubenswrapper[4790]: I1011 10:39:00.554565 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Oct 11 10:39:00.555206 master-0 kubenswrapper[4790]: I1011 10:39:00.555051 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Oct 11 10:39:00.555546 master-0 kubenswrapper[4790]: I1011 10:39:00.555304 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-7mxth"
Oct 11 10:39:00.559690 master-0 kubenswrapper[4790]: I1011 10:39:00.559650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 11 10:39:00.559860 master-0 kubenswrapper[4790]: I1011 10:39:00.559692 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 11 10:39:00.559860 master-0 kubenswrapper[4790]: I1011 10:39:00.559752 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 11 10:39:00.559860 master-0 kubenswrapper[4790]: I1011 10:39:00.559853 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 11 10:39:00.560083 master-0 kubenswrapper[4790]: I1011 10:39:00.559853 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-dockercfg-7xwqj"
Oct 11 10:39:00.560083 master-0 kubenswrapper[4790]: I1011 10:39:00.559863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-4b7xp"
Oct 11 10:39:00.560083 master-0 kubenswrapper[4790]: I1011 10:39:00.559958 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 11 10:39:00.570419 master-0 kubenswrapper[4790]: I1011 10:39:00.570374 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-96nq6"]
Oct 11 10:39:00.571097 master-0 kubenswrapper[4790]: I1011 10:39:00.571066 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.573455 master-0 kubenswrapper[4790]: I1011 10:39:00.573423 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Oct 11 10:39:00.573455 master-0 kubenswrapper[4790]: I1011 10:39:00.573434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-dkksq"
Oct 11 10:39:00.574596 master-0 kubenswrapper[4790]: I1011 10:39:00.574540 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Oct 11 10:39:00.574596 master-0 kubenswrapper[4790]: I1011 10:39:00.574576 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Oct 11 10:39:00.574596 master-0 kubenswrapper[4790]: I1011 10:39:00.574591 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Oct 11 10:39:00.575735 master-0 kubenswrapper[4790]: I1011 10:39:00.575693 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Oct 11 10:39:00.575938 master-0 kubenswrapper[4790]: I1011 10:39:00.575899 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Oct 11 10:39:00.611685 master-0 kubenswrapper[4790]: I1011 10:39:00.611510 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-g99cx"]
Oct 11 10:39:00.612237 master-0 kubenswrapper[4790]: I1011 10:39:00.612201 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g99cx"
Oct 11 10:39:00.616166 master-0 kubenswrapper[4790]: I1011 10:39:00.616118 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Oct 11 10:39:00.616230 master-0 kubenswrapper[4790]: I1011 10:39:00.616171 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Oct 11 10:39:00.616230 master-0 kubenswrapper[4790]: I1011 10:39:00.616189 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-cs6lc"
Oct 11 10:39:00.616985 master-0 kubenswrapper[4790]: I1011 10:39:00.616947 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Oct 11 10:39:00.626134 master-0 kubenswrapper[4790]: I1011 10:39:00.626062 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-modprobe-d\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.626196 master-0 kubenswrapper[4790]: I1011 10:39:00.626136 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-cnibin\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.626196 master-0 kubenswrapper[4790]: I1011 10:39:00.626172 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-k8s-cni-cncf-io\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.626196 master-0 kubenswrapper[4790]: I1011 10:39:00.626194 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-hostroot\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.626292 master-0 kubenswrapper[4790]: I1011 10:39:00.626239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1076411d-ae28-46e4-97ca-9c78203e7aba-mcd-auth-proxy-config\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.626292 master-0 kubenswrapper[4790]: I1011 10:39:00.626261 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-socket-dir-parent\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.626385 master-0 kubenswrapper[4790]: I1011 10:39:00.626334 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-kubernetes\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.626442 master-0 kubenswrapper[4790]: I1011 10:39:00.626386 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysctl-conf\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.626442 master-0 kubenswrapper[4790]: I1011 10:39:00.626408 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c1c727b-713a-4dff-ae8b-ad9b9851adae-cni-binary-copy\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.626442 master-0 kubenswrapper[4790]: I1011 10:39:00.626426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-etc-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.626442 master-0 kubenswrapper[4790]: I1011 10:39:00.626443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovnkube-config\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.626943 master-0 kubenswrapper[4790]: I1011 10:39:00.626464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5kd5\" (UniqueName: \"kubernetes.io/projected/bfe05233-94bf-4e16-8c7e-321435ba7f00-kube-api-access-d5kd5\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.626943 master-0 kubenswrapper[4790]: I1011 10:39:00.626483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.627144 master-0 kubenswrapper[4790]: I1011 10:39:00.627045 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-whereabouts-configmap\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.627216 master-0 kubenswrapper[4790]: I1011 10:39:00.627188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/0a2f91f6-f87a-4b69-a47a-91ca827d8386-ovnkube-identity-cm\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.627256 master-0 kubenswrapper[4790]: I1011 10:39:00.627239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-daemon-config\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.627295 master-0 kubenswrapper[4790]: I1011 10:39:00.627281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-kubelet\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.627342 master-0 kubenswrapper[4790]: I1011 10:39:00.627318 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-log-socket\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.627476 master-0 kubenswrapper[4790]: I1011 10:39:00.627359 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.627476 master-0 kubenswrapper[4790]: I1011 10:39:00.627398 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-run\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.627476 master-0 kubenswrapper[4790]: I1011 10:39:00.627433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-system-cni-dir\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.627476 master-0 kubenswrapper[4790]: I1011 10:39:00.627464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0a2f91f6-f87a-4b69-a47a-91ca827d8386-webhook-cert\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.627581 master-0 kubenswrapper[4790]: I1011 10:39:00.627498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-cni-netd\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.627581 master-0 kubenswrapper[4790]: I1011 10:39:00.627534 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-os-release\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.627637 master-0 kubenswrapper[4790]: I1011 10:39:00.627601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bzq\" (UniqueName: \"kubernetes.io/projected/00e9cb61-65c4-4e6a-bb0c-2428529c63bf-kube-api-access-p5bzq\") pod \"node-resolver-5kghv\" (UID: \"00e9cb61-65c4-4e6a-bb0c-2428529c63bf\") " pod="openshift-dns/node-resolver-5kghv"
Oct 11 10:39:00.627744 master-0 kubenswrapper[4790]: I1011 10:39:00.627695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-cni-bin\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.627788 master-0 kubenswrapper[4790]: I1011 10:39:00.627754 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-conf-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.627788 master-0 kubenswrapper[4790]: I1011 10:39:00.627776 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-slash\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.627845 master-0 kubenswrapper[4790]: I1011 10:39:00.627802 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-ovn\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.627845 master-0 kubenswrapper[4790]: I1011 10:39:00.627823 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-lib-modules\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.627845 master-0 kubenswrapper[4790]: I1011 10:39:00.627840 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcq86\" (UniqueName: \"kubernetes.io/projected/1076411d-ae28-46e4-97ca-9c78203e7aba-kube-api-access-kcq86\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.627928 master-0 kubenswrapper[4790]: I1011 10:39:00.627863 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-cni-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.628065 master-0 kubenswrapper[4790]: I1011 10:39:00.627981 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a2f91f6-f87a-4b69-a47a-91ca827d8386-env-overrides\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.628152 master-0 kubenswrapper[4790]: I1011 10:39:00.628117 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00e9cb61-65c4-4e6a-bb0c-2428529c63bf-hosts-file\") pod \"node-resolver-5kghv\" (UID: \"00e9cb61-65c4-4e6a-bb0c-2428529c63bf\") " pod="openshift-dns/node-resolver-5kghv"
Oct 11 10:39:00.628226 master-0 kubenswrapper[4790]: I1011 10:39:00.628170 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.628264 master-0 kubenswrapper[4790]: I1011 10:39:00.628246 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99wx\" (UniqueName: \"kubernetes.io/projected/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-kube-api-access-l99wx\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.628346 master-0 kubenswrapper[4790]: I1011 10:39:00.628321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-systemd\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.628417 master-0 kubenswrapper[4790]: I1011 10:39:00.628391 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfe05233-94bf-4e16-8c7e-321435ba7f00-tmp\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.628495 master-0 kubenswrapper[4790]: I1011 10:39:00.628436 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cnibin\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.628536 master-0 kubenswrapper[4790]: I1011 10:39:00.628514 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mptfd\" (UniqueName: \"kubernetes.io/projected/24d4b452-8f49-4e9e-98b6-3429afefc4c4-kube-api-access-mptfd\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.628613 master-0 kubenswrapper[4790]: I1011 10:39:00.628587 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-os-release\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.628680 master-0 kubenswrapper[4790]: I1011 10:39:00.628656 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-netns\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.628780 master-0 kubenswrapper[4790]: I1011 10:39:00.628750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-cni-multus\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.628864 master-0 kubenswrapper[4790]: I1011 10:39:00.628798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zswxl\" (UniqueName: \"kubernetes.io/projected/8c1c727b-713a-4dff-ae8b-ad9b9851adae-kube-api-access-zswxl\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.628909 master-0 kubenswrapper[4790]: I1011 10:39:00.628881 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-run-ovn-kubernetes\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.628973 master-0 kubenswrapper[4790]: I1011 10:39:00.628949 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-cni-bin\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629045 master-0 kubenswrapper[4790]: I1011 10:39:00.629020 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-var-lib-kubelet\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.629090 master-0 kubenswrapper[4790]: I1011 10:39:00.629062 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-run-netns\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629159 master-0 kubenswrapper[4790]: I1011 10:39:00.629135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-systemd\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629248 master-0 kubenswrapper[4790]: I1011 10:39:00.629224 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-env-overrides\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629309 master-0 kubenswrapper[4790]: I1011 10:39:00.629282 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovn-node-metrics-cert\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629369 master-0 kubenswrapper[4790]: I1011 10:39:00.629345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-sys\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.629426 master-0 kubenswrapper[4790]: I1011 10:39:00.629382 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-host\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.629426 master-0 kubenswrapper[4790]: I1011 10:39:00.629413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-tuned\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.629481 master-0 kubenswrapper[4790]: I1011 10:39:00.629446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-kubelet\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.629510 master-0 kubenswrapper[4790]: I1011 10:39:00.629476 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-var-lib-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629537 master-0 kubenswrapper[4790]: I1011 10:39:00.629509 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovnkube-script-lib\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629663 master-0 kubenswrapper[4790]: I1011 10:39:00.629539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-system-cni-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.629663 master-0 kubenswrapper[4790]: I1011 10:39:00.629571 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-systemd-units\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629663 master-0 kubenswrapper[4790]: I1011 10:39:00.629601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-node-log\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.629663 master-0 kubenswrapper[4790]: I1011 10:39:00.629629 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysconfig\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.629791 master-0 kubenswrapper[4790]: I1011 10:39:00.629680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1076411d-ae28-46e4-97ca-9c78203e7aba-rootfs\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.629791 master-0 kubenswrapper[4790]: I1011 10:39:00.629735 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.629791 master-0 kubenswrapper[4790]: I1011 10:39:00.629781 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-multus-certs\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.629902 master-0 kubenswrapper[4790]: I1011 10:39:00.629812 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysctl-d\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.629902 master-0 kubenswrapper[4790]: I1011 10:39:00.629856 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1076411d-ae28-46e4-97ca-9c78203e7aba-proxy-tls\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.629902 master-0 kubenswrapper[4790]: I1011 10:39:00.629888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.629998 master-0 kubenswrapper[4790]: I1011 10:39:00.629918 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5w5m\" (UniqueName: \"kubernetes.io/projected/0a2f91f6-f87a-4b69-a47a-91ca827d8386-kube-api-access-p5w5m\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") "
pod="openshift-network-node-identity/network-node-identity-kh4ld" Oct 11 10:39:00.629998 master-0 kubenswrapper[4790]: I1011 10:39:00.629947 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-etc-kubernetes\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.644277 master-0 kubenswrapper[4790]: I1011 10:39:00.644227 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zcc4t"] Oct 11 10:39:00.644785 master-0 kubenswrapper[4790]: I1011 10:39:00.644669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:00.644785 master-0 kubenswrapper[4790]: E1011 10:39:00.644764 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:00.645013 master-0 kubenswrapper[4790]: I1011 10:39:00.644974 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-bn2sv"] Oct 11 10:39:00.645653 master-0 kubenswrapper[4790]: I1011 10:39:00.645610 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:00.645825 master-0 kubenswrapper[4790]: E1011 10:39:00.645781 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:00.647375 master-0 kubenswrapper[4790]: I1011 10:39:00.647318 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-l66k2"] Oct 11 10:39:00.647977 master-0 kubenswrapper[4790]: I1011 10:39:00.647943 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.651823 master-0 kubenswrapper[4790]: I1011 10:39:00.651781 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Oct 11 10:39:00.652106 master-0 kubenswrapper[4790]: I1011 10:39:00.652056 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Oct 11 10:39:00.652402 master-0 kubenswrapper[4790]: I1011 10:39:00.652369 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Oct 11 10:39:00.653880 master-0 kubenswrapper[4790]: I1011 10:39:00.653843 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Oct 11 10:39:00.654013 master-0 kubenswrapper[4790]: I1011 10:39:00.653978 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-vhtwz" Oct 11 10:39:00.655527 master-0 kubenswrapper[4790]: I1011 10:39:00.655487 4790 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Oct 11 10:39:00.730982 master-0 kubenswrapper[4790]: I1011 10:39:00.730903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-var-lib-kubelet\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.730982 master-0 kubenswrapper[4790]: I1011 10:39:00.730955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-run-netns\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.730982 master-0 kubenswrapper[4790]: I1011 10:39:00.730973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-systemd\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.730982 master-0 kubenswrapper[4790]: I1011 10:39:00.730991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-env-overrides\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovn-node-metrics-cert\") pod \"ovnkube-node-96nq6\" (UID: 
\"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731039 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-root\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731057 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-textfile\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731065 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-var-lib-kubelet\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-sys\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-host\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " 
pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731104 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-run-netns\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-systemd\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-tuned\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-kubelet\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731233 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-sys\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.731343 master-0 
kubenswrapper[4790]: I1011 10:39:00.731272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-var-lib-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-kubelet\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731296 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovnkube-script-lib\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731317 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-system-cni-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-var-lib-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 
10:39:00.731338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-systemd-units\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.731343 master-0 kubenswrapper[4790]: I1011 10:39:00.731365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-node-log\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731374 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-systemd-units\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysconfig\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731397 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-system-cni-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731415 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-node-log\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1076411d-ae28-46e4-97ca-9c78203e7aba-rootfs\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731439 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1076411d-ae28-46e4-97ca-9c78203e7aba-rootfs\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysconfig\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731510 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qvq\" (UniqueName: \"kubernetes.io/projected/4e2d32e6-3363-4389-ad6a-cfd917e568d2-kube-api-access-n5qvq\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731536 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-tls\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-multus-certs\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnhq6\" (UniqueName: \"kubernetes.io/projected/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-kube-api-access-gnhq6\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " 
pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysctl-d\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-multus-certs\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731645 4790 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 11 10:39:00.732363 master-0 kubenswrapper[4790]: I1011 10:39:00.731771 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysctl-d\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.731766 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1076411d-ae28-46e4-97ca-9c78203e7aba-proxy-tls\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.731815 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.731289 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-host\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.731844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5w5m\" (UniqueName: \"kubernetes.io/projected/0a2f91f6-f87a-4b69-a47a-91ca827d8386-kube-api-access-p5w5m\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732094 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732139 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-etc-kubernetes\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 
10:39:00.732189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e2d32e6-3363-4389-ad6a-cfd917e568d2-serviceca\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732216 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4xsk\" (UniqueName: \"kubernetes.io/projected/7d9f4c3d-57bd-49f6-94f2-47670b385318-kube-api-access-s4xsk\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-etc-kubernetes\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732268 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-modprobe-d\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.733377 
master-0 kubenswrapper[4790]: I1011 10:39:00.732298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-cnibin\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732314 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-env-overrides\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732369 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-cnibin\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-k8s-cni-cncf-io\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732447 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-k8s-cni-cncf-io\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q" Oct 11 10:39:00.733377 master-0 kubenswrapper[4790]: I1011 10:39:00.732458 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-modprobe-d\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.732499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-hostroot\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.732533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-hostroot\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.732535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-wtmp\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.732577 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1076411d-ae28-46e4-97ca-9c78203e7aba-mcd-auth-proxy-config\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.732597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-socket-dir-parent\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.732621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-sys\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.732649 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-kubernetes\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733571 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysctl-conf\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c1c727b-713a-4dff-ae8b-ad9b9851adae-cni-binary-copy\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733669 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-etc-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733698 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovnkube-config\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovnkube-script-lib\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733777 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5kd5\" (UniqueName: \"kubernetes.io/projected/bfe05233-94bf-4e16-8c7e-321435ba7f00-kube-api-access-d5kd5\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733806 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733841 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-whereabouts-configmap\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.734437 master-0 kubenswrapper[4790]: I1011 10:39:00.733875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/0a2f91f6-f87a-4b69-a47a-91ca827d8386-ovnkube-identity-cm\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.733880 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-etc-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734096 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.733896 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-daemon-config\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1076411d-ae28-46e4-97ca-9c78203e7aba-mcd-auth-proxy-config\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-sysctl-conf\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-kubelet\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734569 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-socket-dir-parent\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734587 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-log-socket\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734636 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-kubelet\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734638 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734698 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-run\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-kubernetes\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-system-cni-dir\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-system-cni-dir\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734854 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-run\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.735510 master-0 kubenswrapper[4790]: I1011 10:39:00.734869 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0a2f91f6-f87a-4b69-a47a-91ca827d8386-webhook-cert\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.734942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-cni-netd\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.734976 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-daemon-config\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.734991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d9f4c3d-57bd-49f6-94f2-47670b385318-metrics-client-ca\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.734888 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-log-socket\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-os-release\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735069 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-cni-netd\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bzq\" (UniqueName: \"kubernetes.io/projected/00e9cb61-65c4-4e6a-bb0c-2428529c63bf-kube-api-access-p5bzq\") pod \"node-resolver-5kghv\" (UID: \"00e9cb61-65c4-4e6a-bb0c-2428529c63bf\") " pod="openshift-dns/node-resolver-5kghv"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735165 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-cni-bin\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735196 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-os-release\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-conf-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-slash\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735299 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-cni-bin\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735319 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-conf-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735337 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-ovn\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-slash\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-whereabouts-configmap\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735408 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-lib-modules\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.736448 master-0 kubenswrapper[4790]: I1011 10:39:00.735451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-ovn\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcq86\" (UniqueName: \"kubernetes.io/projected/1076411d-ae28-46e4-97ca-9c78203e7aba-kube-api-access-kcq86\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735506 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-cni-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735574 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-lib-modules\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735614 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a2f91f6-f87a-4b69-a47a-91ca827d8386-env-overrides\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735619 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovnkube-config\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735690 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-multus-cni-dir\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00e9cb61-65c4-4e6a-bb0c-2428529c63bf-hosts-file\") pod \"node-resolver-5kghv\" (UID: \"00e9cb61-65c4-4e6a-bb0c-2428529c63bf\") " pod="openshift-dns/node-resolver-5kghv"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/0a2f91f6-f87a-4b69-a47a-91ca827d8386-ovnkube-identity-cm\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735809 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735621 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c1c727b-713a-4dff-ae8b-ad9b9851adae-cni-binary-copy\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735867 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99wx\" (UniqueName: \"kubernetes.io/projected/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-kube-api-access-l99wx\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00e9cb61-65c4-4e6a-bb0c-2428529c63bf-hosts-file\") pod \"node-resolver-5kghv\" (UID: \"00e9cb61-65c4-4e6a-bb0c-2428529c63bf\") " pod="openshift-dns/node-resolver-5kghv"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-systemd\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.735936 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-run-openvswitch\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.736076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfe05233-94bf-4e16-8c7e-321435ba7f00-tmp\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.736091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-systemd\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.737422 master-0 kubenswrapper[4790]: I1011 10:39:00.736104 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a2f91f6-f87a-4b69-a47a-91ca827d8386-env-overrides\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cnibin\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736295 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mptfd\" (UniqueName: \"kubernetes.io/projected/24d4b452-8f49-4e9e-98b6-3429afefc4c4-kube-api-access-mptfd\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736459 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-os-release\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-netns\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-cni-multus\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736796 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cnibin\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736876 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zswxl\" (UniqueName: \"kubernetes.io/projected/8c1c727b-713a-4dff-ae8b-ad9b9851adae-kube-api-access-zswxl\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-run-ovn-kubernetes\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-cni-bin\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.736977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-var-lib-cni-multus\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.737092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-os-release\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.737112 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-cni-bin\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.737124 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24d4b452-8f49-4e9e-98b6-3429afefc4c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.737236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e2d32e6-3363-4389-ad6a-cfd917e568d2-host\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.737164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-host-run-ovn-kubernetes\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.738581 master-0 kubenswrapper[4790]: I1011 10:39:00.737168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c1c727b-713a-4dff-ae8b-ad9b9851adae-host-run-netns\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.739601 master-0 kubenswrapper[4790]: I1011 10:39:00.739008 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-ovn-node-metrics-cert\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.739730 master-0 kubenswrapper[4790]: I1011 10:39:00.739019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfe05233-94bf-4e16-8c7e-321435ba7f00-etc-tuned\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.741144 master-0 kubenswrapper[4790]: I1011 10:39:00.741089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1076411d-ae28-46e4-97ca-9c78203e7aba-proxy-tls\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.741886 master-0 kubenswrapper[4790]: I1011 10:39:00.741809 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0a2f91f6-f87a-4b69-a47a-91ca827d8386-webhook-cert\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.742080 master-0 kubenswrapper[4790]: I1011 10:39:00.742022 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfe05233-94bf-4e16-8c7e-321435ba7f00-tmp\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.762434 master-0 kubenswrapper[4790]: I1011 10:39:00.762375 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5w5m\" (UniqueName: \"kubernetes.io/projected/0a2f91f6-f87a-4b69-a47a-91ca827d8386-kube-api-access-p5w5m\") pod \"network-node-identity-kh4ld\" (UID: \"0a2f91f6-f87a-4b69-a47a-91ca827d8386\") " pod="openshift-network-node-identity/network-node-identity-kh4ld"
Oct 11 10:39:00.764985 master-0 kubenswrapper[4790]: I1011 10:39:00.764933 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcq86\" (UniqueName: \"kubernetes.io/projected/1076411d-ae28-46e4-97ca-9c78203e7aba-kube-api-access-kcq86\") pod \"machine-config-daemon-8lkdg\" (UID: \"1076411d-ae28-46e4-97ca-9c78203e7aba\") " pod="openshift-machine-config-operator/machine-config-daemon-8lkdg"
Oct 11 10:39:00.766899 master-0 kubenswrapper[4790]: I1011 10:39:00.766840 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bzq\" (UniqueName: \"kubernetes.io/projected/00e9cb61-65c4-4e6a-bb0c-2428529c63bf-kube-api-access-p5bzq\") pod \"node-resolver-5kghv\" (UID: \"00e9cb61-65c4-4e6a-bb0c-2428529c63bf\") " pod="openshift-dns/node-resolver-5kghv"
Oct 11 10:39:00.767235 master-0 kubenswrapper[4790]: I1011 10:39:00.767185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mptfd\" (UniqueName: \"kubernetes.io/projected/24d4b452-8f49-4e9e-98b6-3429afefc4c4-kube-api-access-mptfd\") pod \"multus-additional-cni-plugins-ft6fv\" (UID: \"24d4b452-8f49-4e9e-98b6-3429afefc4c4\") " pod="openshift-multus/multus-additional-cni-plugins-ft6fv"
Oct 11 10:39:00.767636 master-0 kubenswrapper[4790]: I1011 10:39:00.767597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99wx\" (UniqueName: \"kubernetes.io/projected/417d5cfd-0cf3-4d96-b901-fcfe4f742ca5-kube-api-access-l99wx\") pod \"ovnkube-node-96nq6\" (UID: \"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5\") " pod="openshift-ovn-kubernetes/ovnkube-node-96nq6"
Oct 11 10:39:00.770866 master-0 kubenswrapper[4790]: I1011 10:39:00.770825 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zswxl\" (UniqueName: \"kubernetes.io/projected/8c1c727b-713a-4dff-ae8b-ad9b9851adae-kube-api-access-zswxl\") pod \"multus-r499q\" (UID: \"8c1c727b-713a-4dff-ae8b-ad9b9851adae\") " pod="openshift-multus/multus-r499q"
Oct 11 10:39:00.772259 master-0 kubenswrapper[4790]: I1011 10:39:00.772202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5kd5\" (UniqueName: \"kubernetes.io/projected/bfe05233-94bf-4e16-8c7e-321435ba7f00-kube-api-access-d5kd5\") pod \"tuned-85bvx\" (UID: \"bfe05233-94bf-4e16-8c7e-321435ba7f00\") " pod="openshift-cluster-node-tuning-operator/tuned-85bvx"
Oct 11 10:39:00.837843 master-0 kubenswrapper[4790]: I1011 10:39:00.837770 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2"
Oct 11 10:39:00.837843 master-0 kubenswrapper[4790]: I1011 10:39:00.837847 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qvq\" (UniqueName: \"kubernetes.io/projected/4e2d32e6-3363-4389-ad6a-cfd917e568d2-kube-api-access-n5qvq\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx"
Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.837871 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-tls\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2"
Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.837889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnhq6\" (UniqueName: \"kubernetes.io/projected/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-kube-api-access-gnhq6\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.837907 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e2d32e6-3363-4389-ad6a-cfd917e568d2-serviceca\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx"
Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.837926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.837944 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xsk\" (UniqueName: \"kubernetes.io/projected/7d9f4c3d-57bd-49f6-94f2-47670b385318-kube-api-access-s4xsk\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2"
Oct 11 10:39:00.838174 master-0
kubenswrapper[4790]: I1011 10:39:00.837991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-wtmp\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.838021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-sys\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.838076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.838100 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d9f4c3d-57bd-49f6-94f2-47670b385318-metrics-client-ca\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.838166 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e2d32e6-3363-4389-ad6a-cfd917e568d2-host\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx" Oct 11 10:39:00.838174 master-0 kubenswrapper[4790]: I1011 10:39:00.838191 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-root\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.838796 master-0 kubenswrapper[4790]: I1011 10:39:00.838211 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-textfile\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.838796 master-0 kubenswrapper[4790]: I1011 10:39:00.838452 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-wtmp\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.838796 master-0 kubenswrapper[4790]: I1011 10:39:00.838638 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e2d32e6-3363-4389-ad6a-cfd917e568d2-host\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx" Oct 11 10:39:00.838900 master-0 kubenswrapper[4790]: E1011 10:39:00.838872 4790 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:00.839000 master-0 kubenswrapper[4790]: I1011 10:39:00.838931 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-sys\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " 
pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.839000 master-0 kubenswrapper[4790]: E1011 10:39:00.838950 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs podName:a5b695d5-a88c-4ff9-bc59-d13f61f237f6 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:01.338926142 +0000 UTC m=+17.893386664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs") pod "network-metrics-daemon-zcc4t" (UID: "a5b695d5-a88c-4ff9-bc59-d13f61f237f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:00.839124 master-0 kubenswrapper[4790]: I1011 10:39:00.839012 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e2d32e6-3363-4389-ad6a-cfd917e568d2-serviceca\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx" Oct 11 10:39:00.839124 master-0 kubenswrapper[4790]: I1011 10:39:00.839031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7d9f4c3d-57bd-49f6-94f2-47670b385318-root\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.839124 master-0 kubenswrapper[4790]: I1011 10:39:00.839100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-textfile\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.840167 master-0 kubenswrapper[4790]: I1011 10:39:00.840121 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d9f4c3d-57bd-49f6-94f2-47670b385318-metrics-client-ca\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.841774 master-0 kubenswrapper[4790]: I1011 10:39:00.841725 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.841994 master-0 kubenswrapper[4790]: I1011 10:39:00.841838 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7d9f4c3d-57bd-49f6-94f2-47670b385318-node-exporter-tls\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.858262 master-0 kubenswrapper[4790]: I1011 10:39:00.858225 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qvq\" (UniqueName: \"kubernetes.io/projected/4e2d32e6-3363-4389-ad6a-cfd917e568d2-kube-api-access-n5qvq\") pod \"node-ca-g99cx\" (UID: \"4e2d32e6-3363-4389-ad6a-cfd917e568d2\") " pod="openshift-image-registry/node-ca-g99cx" Oct 11 10:39:00.870561 master-0 kubenswrapper[4790]: I1011 10:39:00.870512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnhq6\" (UniqueName: \"kubernetes.io/projected/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-kube-api-access-gnhq6\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:00.878053 master-0 kubenswrapper[4790]: E1011 10:39:00.877990 4790 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 10:39:00.878053 master-0 kubenswrapper[4790]: E1011 10:39:00.878025 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 10:39:00.878053 master-0 kubenswrapper[4790]: E1011 10:39:00.878042 4790 projected.go:194] Error preparing data for projected volume kube-api-access-8xlkt for pod openshift-network-diagnostics/network-check-target-bn2sv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:00.878270 master-0 kubenswrapper[4790]: E1011 10:39:00.878103 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt podName:0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:01.378083387 +0000 UTC m=+17.932543689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8xlkt" (UniqueName: "kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt") pod "network-check-target-bn2sv" (UID: "0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:00.882207 master-0 kubenswrapper[4790]: I1011 10:39:00.882159 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5kghv" Oct 11 10:39:00.885545 master-0 kubenswrapper[4790]: I1011 10:39:00.885491 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4xsk\" (UniqueName: \"kubernetes.io/projected/7d9f4c3d-57bd-49f6-94f2-47670b385318-kube-api-access-s4xsk\") pod \"node-exporter-l66k2\" (UID: \"7d9f4c3d-57bd-49f6-94f2-47670b385318\") " pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:00.889685 master-0 kubenswrapper[4790]: I1011 10:39:00.889625 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" Oct 11 10:39:00.906416 master-0 kubenswrapper[4790]: I1011 10:39:00.906356 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-85bvx" Oct 11 10:39:00.907238 master-0 kubenswrapper[4790]: W1011 10:39:00.906925 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1076411d_ae28_46e4_97ca_9c78203e7aba.slice/crio-3ce18fa8f5bbc0577ffd95d727f9818328c95a8e3e9dee18bd6c5f018241a8f5 WatchSource:0}: Error finding container 3ce18fa8f5bbc0577ffd95d727f9818328c95a8e3e9dee18bd6c5f018241a8f5: Status 404 returned error can't find the container with id 3ce18fa8f5bbc0577ffd95d727f9818328c95a8e3e9dee18bd6c5f018241a8f5 Oct 11 10:39:00.918304 master-0 kubenswrapper[4790]: W1011 10:39:00.918238 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe05233_94bf_4e16_8c7e_321435ba7f00.slice/crio-6ec1ae3f7ba3b832a6f74bed193851690928b4c9fb1f1e5b2c927a922be00ea8 WatchSource:0}: Error finding container 6ec1ae3f7ba3b832a6f74bed193851690928b4c9fb1f1e5b2c927a922be00ea8: Status 404 returned error can't find the container with id 6ec1ae3f7ba3b832a6f74bed193851690928b4c9fb1f1e5b2c927a922be00ea8 Oct 
11 10:39:00.934388 master-0 kubenswrapper[4790]: I1011 10:39:00.934350 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r499q" Oct 11 10:39:00.956892 master-0 kubenswrapper[4790]: I1011 10:39:00.956826 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" Oct 11 10:39:00.958824 master-0 kubenswrapper[4790]: W1011 10:39:00.958747 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c1c727b_713a_4dff_ae8b_ad9b9851adae.slice/crio-8b87d6351e36c46ff6fb7661e0cca8b56ff438053d98cd170a47fe2544b6bf78 WatchSource:0}: Error finding container 8b87d6351e36c46ff6fb7661e0cca8b56ff438053d98cd170a47fe2544b6bf78: Status 404 returned error can't find the container with id 8b87d6351e36c46ff6fb7661e0cca8b56ff438053d98cd170a47fe2544b6bf78 Oct 11 10:39:00.983778 master-0 kubenswrapper[4790]: I1011 10:39:00.983701 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-kh4ld" Oct 11 10:39:00.991835 master-0 kubenswrapper[4790]: I1011 10:39:00.991666 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:00.996604 master-0 kubenswrapper[4790]: W1011 10:39:00.996143 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a2f91f6_f87a_4b69_a47a_91ca827d8386.slice/crio-99c07ccfcdb0c1c10e6734b3cff3a1985e39f219858ea25caacb4c0621d10a80 WatchSource:0}: Error finding container 99c07ccfcdb0c1c10e6734b3cff3a1985e39f219858ea25caacb4c0621d10a80: Status 404 returned error can't find the container with id 99c07ccfcdb0c1c10e6734b3cff3a1985e39f219858ea25caacb4c0621d10a80 Oct 11 10:39:01.008497 master-0 kubenswrapper[4790]: W1011 10:39:01.008429 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417d5cfd_0cf3_4d96_b901_fcfe4f742ca5.slice/crio-3d8ef793e5a5ca476c1f291d9d684fa9ca870e0fff4eff886589819a876c33d6 WatchSource:0}: Error finding container 3d8ef793e5a5ca476c1f291d9d684fa9ca870e0fff4eff886589819a876c33d6: Status 404 returned error can't find the container with id 3d8ef793e5a5ca476c1f291d9d684fa9ca870e0fff4eff886589819a876c33d6 Oct 11 10:39:01.015683 master-0 kubenswrapper[4790]: I1011 10:39:01.015576 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g99cx" Oct 11 10:39:01.020947 master-0 kubenswrapper[4790]: I1011 10:39:01.020871 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-l66k2" Oct 11 10:39:01.035402 master-0 kubenswrapper[4790]: W1011 10:39:01.035329 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2d32e6_3363_4389_ad6a_cfd917e568d2.slice/crio-5e845fde7720879c7db8961ea19d160365f000111cc54971bf7e8c7757b7e0a9 WatchSource:0}: Error finding container 5e845fde7720879c7db8961ea19d160365f000111cc54971bf7e8c7757b7e0a9: Status 404 returned error can't find the container with id 5e845fde7720879c7db8961ea19d160365f000111cc54971bf7e8c7757b7e0a9 Oct 11 10:39:01.036779 master-0 kubenswrapper[4790]: W1011 10:39:01.036418 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d9f4c3d_57bd_49f6_94f2_47670b385318.slice/crio-bf65362b1aa13d897033e36adeefa4f842093b6cac0a70f775f250b0d47f7b3e WatchSource:0}: Error finding container bf65362b1aa13d897033e36adeefa4f842093b6cac0a70f775f250b0d47f7b3e: Status 404 returned error can't find the container with id bf65362b1aa13d897033e36adeefa4f842093b6cac0a70f775f250b0d47f7b3e Oct 11 10:39:01.343428 master-0 kubenswrapper[4790]: I1011 10:39:01.343344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:01.344365 master-0 kubenswrapper[4790]: E1011 10:39:01.343596 4790 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:01.344365 master-0 kubenswrapper[4790]: E1011 10:39:01.343686 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs podName:a5b695d5-a88c-4ff9-bc59-d13f61f237f6 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:02.343656449 +0000 UTC m=+18.898116741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs") pod "network-metrics-daemon-zcc4t" (UID: "a5b695d5-a88c-4ff9-bc59-d13f61f237f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:01.348069 master-0 kubenswrapper[4790]: I1011 10:39:01.348001 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5kghv" event={"ID":"00e9cb61-65c4-4e6a-bb0c-2428529c63bf","Type":"ContainerStarted","Data":"6b840ff8900b85e6283f49aa581413b60cfc22d11df8f95161f47cca0d1657d7"} Oct 11 10:39:01.350732 master-0 kubenswrapper[4790]: I1011 10:39:01.350627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r499q" event={"ID":"8c1c727b-713a-4dff-ae8b-ad9b9851adae","Type":"ContainerStarted","Data":"8b87d6351e36c46ff6fb7661e0cca8b56ff438053d98cd170a47fe2544b6bf78"} Oct 11 10:39:01.352754 master-0 kubenswrapper[4790]: I1011 10:39:01.352669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l66k2" event={"ID":"7d9f4c3d-57bd-49f6-94f2-47670b385318","Type":"ContainerStarted","Data":"bf65362b1aa13d897033e36adeefa4f842093b6cac0a70f775f250b0d47f7b3e"} Oct 11 10:39:01.354427 master-0 kubenswrapper[4790]: I1011 10:39:01.354365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g99cx" event={"ID":"4e2d32e6-3363-4389-ad6a-cfd917e568d2","Type":"ContainerStarted","Data":"5e845fde7720879c7db8961ea19d160365f000111cc54971bf7e8c7757b7e0a9"} Oct 11 10:39:01.355736 master-0 kubenswrapper[4790]: I1011 10:39:01.355630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"3d8ef793e5a5ca476c1f291d9d684fa9ca870e0fff4eff886589819a876c33d6"} Oct 11 10:39:01.356818 master-0 kubenswrapper[4790]: I1011 10:39:01.356771 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kh4ld" event={"ID":"0a2f91f6-f87a-4b69-a47a-91ca827d8386","Type":"ContainerStarted","Data":"99c07ccfcdb0c1c10e6734b3cff3a1985e39f219858ea25caacb4c0621d10a80"} Oct 11 10:39:01.358084 master-0 kubenswrapper[4790]: I1011 10:39:01.357929 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-85bvx" event={"ID":"bfe05233-94bf-4e16-8c7e-321435ba7f00","Type":"ContainerStarted","Data":"6ec1ae3f7ba3b832a6f74bed193851690928b4c9fb1f1e5b2c927a922be00ea8"} Oct 11 10:39:01.360400 master-0 kubenswrapper[4790]: I1011 10:39:01.360330 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" event={"ID":"1076411d-ae28-46e4-97ca-9c78203e7aba","Type":"ContainerStarted","Data":"29d9bf6586931cd43550ae895256ff3093100c55fe6b5e2843d696b112b149af"} Oct 11 10:39:01.360400 master-0 kubenswrapper[4790]: I1011 10:39:01.360358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" event={"ID":"1076411d-ae28-46e4-97ca-9c78203e7aba","Type":"ContainerStarted","Data":"0620cf16da36928a13e95e840388b3e5cfd335c5d5355474658f54b96c75c15f"} Oct 11 10:39:01.360400 master-0 kubenswrapper[4790]: I1011 10:39:01.360369 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" event={"ID":"1076411d-ae28-46e4-97ca-9c78203e7aba","Type":"ContainerStarted","Data":"3ce18fa8f5bbc0577ffd95d727f9818328c95a8e3e9dee18bd6c5f018241a8f5"} Oct 11 10:39:01.362141 master-0 kubenswrapper[4790]: I1011 
10:39:01.362085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerStarted","Data":"b95a93cafe5555685cb6e03ef19e23795847d7899f10c93f94dfce6df82aba47"} Oct 11 10:39:01.391622 master-0 kubenswrapper[4790]: I1011 10:39:01.391494 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8lkdg" podStartSLOduration=11.391470627 podStartE2EDuration="11.391470627s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:39:01.391239111 +0000 UTC m=+17.945699433" watchObservedRunningTime="2025-10-11 10:39:01.391470627 +0000 UTC m=+17.945930919" Oct 11 10:39:01.444597 master-0 kubenswrapper[4790]: I1011 10:39:01.444378 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:01.445076 master-0 kubenswrapper[4790]: E1011 10:39:01.444653 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 10:39:01.445076 master-0 kubenswrapper[4790]: E1011 10:39:01.444736 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 10:39:01.445076 master-0 kubenswrapper[4790]: E1011 10:39:01.444761 4790 projected.go:194] Error preparing data for projected volume kube-api-access-8xlkt for pod 
openshift-network-diagnostics/network-check-target-bn2sv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:01.445076 master-0 kubenswrapper[4790]: E1011 10:39:01.444866 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt podName:0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:02.444835827 +0000 UTC m=+18.999296159 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xlkt" (UniqueName: "kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt") pod "network-check-target-bn2sv" (UID: "0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:01.934323 master-0 kubenswrapper[4790]: I1011 10:39:01.934255 4790 ???:1] "http: TLS handshake error from 192.168.34.12:42632: no serving certificate available for the kubelet" Oct 11 10:39:02.292050 master-0 kubenswrapper[4790]: I1011 10:39:02.292003 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:02.292181 master-0 kubenswrapper[4790]: E1011 10:39:02.292149 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:02.293152 master-0 kubenswrapper[4790]: I1011 10:39:02.292424 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:02.293152 master-0 kubenswrapper[4790]: I1011 10:39:02.292601 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:39:02.293152 master-0 kubenswrapper[4790]: E1011 10:39:02.293064 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:02.293348 master-0 kubenswrapper[4790]: E1011 10:39:02.293285 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-5-master-0" podUID="795a4c8d-2d06-412c-a788-7e8585d432f7" Oct 11 10:39:02.353105 master-0 kubenswrapper[4790]: I1011 10:39:02.353046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:02.353980 master-0 kubenswrapper[4790]: E1011 10:39:02.353258 4790 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:02.353980 master-0 kubenswrapper[4790]: E1011 10:39:02.353350 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs podName:a5b695d5-a88c-4ff9-bc59-d13f61f237f6 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:04.35332329 +0000 UTC m=+20.907783582 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs") pod "network-metrics-daemon-zcc4t" (UID: "a5b695d5-a88c-4ff9-bc59-d13f61f237f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:02.453761 master-0 kubenswrapper[4790]: I1011 10:39:02.453411 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:02.453761 master-0 kubenswrapper[4790]: E1011 10:39:02.453656 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 10:39:02.453761 master-0 kubenswrapper[4790]: E1011 10:39:02.453697 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 10:39:02.453761 master-0 kubenswrapper[4790]: E1011 10:39:02.453726 4790 projected.go:194] Error preparing data for projected volume kube-api-access-8xlkt for pod openshift-network-diagnostics/network-check-target-bn2sv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:02.453761 master-0 kubenswrapper[4790]: E1011 10:39:02.453795 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt podName:0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7 nodeName:}" failed. 
No retries permitted until 2025-10-11 10:39:04.453774319 +0000 UTC m=+21.008234611 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xlkt" (UniqueName: "kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt") pod "network-check-target-bn2sv" (UID: "0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:03.399898 master-0 kubenswrapper[4790]: I1011 10:39:03.399826 4790 generic.go:334] "Generic (PLEG): container finished" podID="7d9f4c3d-57bd-49f6-94f2-47670b385318" containerID="1dcd42b41b2999d5139ed007f6535fac65ea9418fe9313b04e66f09cfb1775ec" exitCode=0 Oct 11 10:39:03.399898 master-0 kubenswrapper[4790]: I1011 10:39:03.399900 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l66k2" event={"ID":"7d9f4c3d-57bd-49f6-94f2-47670b385318","Type":"ContainerDied","Data":"1dcd42b41b2999d5139ed007f6535fac65ea9418fe9313b04e66f09cfb1775ec"} Oct 11 10:39:04.291969 master-0 kubenswrapper[4790]: I1011 10:39:04.291838 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:04.292966 master-0 kubenswrapper[4790]: I1011 10:39:04.292919 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:39:04.293107 master-0 kubenswrapper[4790]: I1011 10:39:04.292973 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:04.293107 master-0 kubenswrapper[4790]: E1011 10:39:04.293055 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-5-master-0" podUID="795a4c8d-2d06-412c-a788-7e8585d432f7" Oct 11 10:39:04.293270 master-0 kubenswrapper[4790]: E1011 10:39:04.293150 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:04.293270 master-0 kubenswrapper[4790]: I1011 10:39:04.293238 4790 scope.go:117] "RemoveContainer" containerID="0dce5d6b10f720448939fdc7754f0953d8f5becf80952889b528531812732245" Oct 11 10:39:04.293431 master-0 kubenswrapper[4790]: E1011 10:39:04.293279 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:04.369558 master-0 kubenswrapper[4790]: I1011 10:39:04.369415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access\") pod \"installer-5-master-0\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " pod="openshift-etcd/installer-5-master-0" Oct 11 10:39:04.369558 master-0 kubenswrapper[4790]: I1011 10:39:04.369472 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:04.370073 master-0 kubenswrapper[4790]: E1011 10:39:04.369612 4790 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:04.370073 master-0 kubenswrapper[4790]: E1011 10:39:04.369679 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:04.370073 master-0 kubenswrapper[4790]: E1011 10:39:04.369744 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-5-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:04.370073 master-0 kubenswrapper[4790]: E1011 10:39:04.369689 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs podName:a5b695d5-a88c-4ff9-bc59-d13f61f237f6 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:08.369667806 +0000 UTC m=+24.924128098 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs") pod "network-metrics-daemon-zcc4t" (UID: "a5b695d5-a88c-4ff9-bc59-d13f61f237f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:04.370073 master-0 kubenswrapper[4790]: E1011 10:39:04.369843 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access podName:795a4c8d-2d06-412c-a788-7e8585d432f7 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:12.36981668 +0000 UTC m=+28.924277032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access") pod "installer-5-master-0" (UID: "795a4c8d-2d06-412c-a788-7e8585d432f7") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:04.470422 master-0 kubenswrapper[4790]: I1011 10:39:04.470307 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:04.471531 master-0 kubenswrapper[4790]: E1011 10:39:04.470545 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 10:39:04.471531 master-0 kubenswrapper[4790]: E1011 10:39:04.470583 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 10:39:04.471531 master-0 kubenswrapper[4790]: E1011 10:39:04.470597 4790 projected.go:194] 
Error preparing data for projected volume kube-api-access-8xlkt for pod openshift-network-diagnostics/network-check-target-bn2sv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:04.471531 master-0 kubenswrapper[4790]: E1011 10:39:04.470675 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt podName:0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:08.470653169 +0000 UTC m=+25.025113451 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xlkt" (UniqueName: "kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt") pod "network-check-target-bn2sv" (UID: "0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:05.278434 master-0 kubenswrapper[4790]: I1011 10:39:05.278375 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-5-master-0"] Oct 11 10:39:05.278918 master-0 kubenswrapper[4790]: I1011 10:39:05.278698 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:39:05.288019 master-0 kubenswrapper[4790]: I1011 10:39:05.287989 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:39:05.378135 master-0 kubenswrapper[4790]: I1011 10:39:05.378079 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-kubelet-dir\") pod \"795a4c8d-2d06-412c-a788-7e8585d432f7\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " Oct 11 10:39:05.378135 master-0 kubenswrapper[4790]: I1011 10:39:05.378161 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-var-lock\") pod \"795a4c8d-2d06-412c-a788-7e8585d432f7\" (UID: \"795a4c8d-2d06-412c-a788-7e8585d432f7\") " Oct 11 10:39:05.378456 master-0 kubenswrapper[4790]: I1011 10:39:05.378229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "795a4c8d-2d06-412c-a788-7e8585d432f7" (UID: "795a4c8d-2d06-412c-a788-7e8585d432f7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:39:05.378456 master-0 kubenswrapper[4790]: I1011 10:39:05.378299 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-var-lock" (OuterVolumeSpecName: "var-lock") pod "795a4c8d-2d06-412c-a788-7e8585d432f7" (UID: "795a4c8d-2d06-412c-a788-7e8585d432f7"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:39:05.378521 master-0 kubenswrapper[4790]: I1011 10:39:05.378486 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:39:05.378521 master-0 kubenswrapper[4790]: I1011 10:39:05.378515 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/795a4c8d-2d06-412c-a788-7e8585d432f7-var-lock\") on node \"master-0\" DevicePath \"\"" Oct 11 10:39:05.405098 master-0 kubenswrapper[4790]: I1011 10:39:05.405051 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-0" Oct 11 10:39:05.494187 master-0 kubenswrapper[4790]: I1011 10:39:05.494139 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-5-master-0"] Oct 11 10:39:05.503602 master-0 kubenswrapper[4790]: I1011 10:39:05.503557 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/installer-5-master-0"] Oct 11 10:39:05.580802 master-0 kubenswrapper[4790]: I1011 10:39:05.580743 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/795a4c8d-2d06-412c-a788-7e8585d432f7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:39:06.291917 master-0 kubenswrapper[4790]: I1011 10:39:06.291834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:06.292155 master-0 kubenswrapper[4790]: I1011 10:39:06.291837 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:06.292155 master-0 kubenswrapper[4790]: E1011 10:39:06.292056 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:06.292252 master-0 kubenswrapper[4790]: E1011 10:39:06.292163 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:08.292174 master-0 kubenswrapper[4790]: I1011 10:39:08.292098 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:08.292803 master-0 kubenswrapper[4790]: I1011 10:39:08.292094 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:08.292803 master-0 kubenswrapper[4790]: E1011 10:39:08.292251 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:08.292803 master-0 kubenswrapper[4790]: E1011 10:39:08.292423 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:08.408264 master-0 kubenswrapper[4790]: I1011 10:39:08.408215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:08.408631 master-0 kubenswrapper[4790]: E1011 10:39:08.408480 4790 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:08.408631 master-0 kubenswrapper[4790]: E1011 10:39:08.408592 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs podName:a5b695d5-a88c-4ff9-bc59-d13f61f237f6 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:16.408564357 +0000 UTC m=+32.963024819 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs") pod "network-metrics-daemon-zcc4t" (UID: "a5b695d5-a88c-4ff9-bc59-d13f61f237f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:08.509421 master-0 kubenswrapper[4790]: I1011 10:39:08.509359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:08.509672 master-0 kubenswrapper[4790]: E1011 10:39:08.509625 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 10:39:08.509672 master-0 kubenswrapper[4790]: E1011 10:39:08.509650 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 10:39:08.509672 master-0 kubenswrapper[4790]: E1011 10:39:08.509667 4790 projected.go:194] Error preparing data for projected volume kube-api-access-8xlkt for pod openshift-network-diagnostics/network-check-target-bn2sv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:08.509787 master-0 kubenswrapper[4790]: E1011 10:39:08.509754 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt podName:0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7 nodeName:}" failed. 
No retries permitted until 2025-10-11 10:39:16.509734084 +0000 UTC m=+33.064194376 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xlkt" (UniqueName: "kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt") pod "network-check-target-bn2sv" (UID: "0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:08.667204 master-0 kubenswrapper[4790]: I1011 10:39:08.667153 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-6-master-0"] Oct 11 10:39:08.667564 master-0 kubenswrapper[4790]: I1011 10:39:08.667545 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.667680 master-0 kubenswrapper[4790]: E1011 10:39:08.667645 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57" Oct 11 10:39:08.812696 master-0 kubenswrapper[4790]: I1011 10:39:08.812262 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.812696 master-0 kubenswrapper[4790]: I1011 10:39:08.812699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.813075 master-0 kubenswrapper[4790]: I1011 10:39:08.812763 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-var-lock\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.914012 master-0 kubenswrapper[4790]: I1011 10:39:08.913894 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-var-lock\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.914012 master-0 kubenswrapper[4790]: I1011 10:39:08.914017 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") pod \"installer-6-master-0\" (UID: 
\"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.914349 master-0 kubenswrapper[4790]: I1011 10:39:08.914071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.914349 master-0 kubenswrapper[4790]: I1011 10:39:08.914105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.914349 master-0 kubenswrapper[4790]: I1011 10:39:08.914065 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-var-lock\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:08.940359 master-0 kubenswrapper[4790]: E1011 10:39:08.940153 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:08.940359 master-0 kubenswrapper[4790]: E1011 10:39:08.940207 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-6-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:08.940359 master-0 kubenswrapper[4790]: E1011 10:39:08.940283 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access podName:50029f27-4009-4075-b148-02f232416a57 nodeName:}" failed. 
No retries permitted until 2025-10-11 10:39:09.440257098 +0000 UTC m=+25.994717390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access") pod "installer-6-master-0" (UID: "50029f27-4009-4075-b148-02f232416a57") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:09.161211 master-0 kubenswrapper[4790]: I1011 10:39:09.161148 4790 ???:1] "http: TLS handshake error from 192.168.34.11:38028: no serving certificate available for the kubelet" Oct 11 10:39:09.520328 master-0 kubenswrapper[4790]: I1011 10:39:09.520266 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:09.521081 master-0 kubenswrapper[4790]: E1011 10:39:09.520459 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:09.521081 master-0 kubenswrapper[4790]: E1011 10:39:09.520480 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-6-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:09.521081 master-0 kubenswrapper[4790]: E1011 10:39:09.520533 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access podName:50029f27-4009-4075-b148-02f232416a57 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:10.520516784 +0000 UTC m=+27.074977076 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access") pod "installer-6-master-0" (UID: "50029f27-4009-4075-b148-02f232416a57") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:10.292566 master-0 kubenswrapper[4790]: I1011 10:39:10.292435 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:10.292566 master-0 kubenswrapper[4790]: I1011 10:39:10.292435 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:10.293042 master-0 kubenswrapper[4790]: E1011 10:39:10.292601 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:10.293042 master-0 kubenswrapper[4790]: E1011 10:39:10.292748 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:10.293042 master-0 kubenswrapper[4790]: I1011 10:39:10.292458 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:10.293239 master-0 kubenswrapper[4790]: E1011 10:39:10.293139 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57" Oct 11 10:39:10.529760 master-0 kubenswrapper[4790]: I1011 10:39:10.529588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:10.529760 master-0 kubenswrapper[4790]: E1011 10:39:10.529782 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:10.529760 master-0 kubenswrapper[4790]: E1011 10:39:10.529802 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-6-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:10.530998 master-0 kubenswrapper[4790]: E1011 10:39:10.529847 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access podName:50029f27-4009-4075-b148-02f232416a57 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:12.529829316 +0000 UTC m=+29.084289608 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access") pod "installer-6-master-0" (UID: "50029f27-4009-4075-b148-02f232416a57") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:12.292157 master-0 kubenswrapper[4790]: I1011 10:39:12.292009 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:12.292157 master-0 kubenswrapper[4790]: I1011 10:39:12.292082 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:12.293365 master-0 kubenswrapper[4790]: I1011 10:39:12.292008 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:12.293365 master-0 kubenswrapper[4790]: E1011 10:39:12.292284 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:12.293365 master-0 kubenswrapper[4790]: E1011 10:39:12.292391 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:12.293365 master-0 kubenswrapper[4790]: E1011 10:39:12.292508 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:12.302319 master-0 kubenswrapper[4790]: I1011 10:39:12.302253 4790 ???:1] "http: TLS handshake error from 192.168.34.12:45952: no serving certificate available for the kubelet"
Oct 11 10:39:12.550024 master-0 kubenswrapper[4790]: I1011 10:39:12.549938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:12.550414 master-0 kubenswrapper[4790]: E1011 10:39:12.550305 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:12.550414 master-0 kubenswrapper[4790]: E1011 10:39:12.550396 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-6-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:12.550604 master-0 kubenswrapper[4790]: E1011 10:39:12.550547 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access podName:50029f27-4009-4075-b148-02f232416a57 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:16.550487973 +0000 UTC m=+33.104948305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access") pod "installer-6-master-0" (UID: "50029f27-4009-4075-b148-02f232416a57") : object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:14.292367 master-0 kubenswrapper[4790]: I1011 10:39:14.292258 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:14.293217 master-0 kubenswrapper[4790]: I1011 10:39:14.292437 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:14.296070 master-0 kubenswrapper[4790]: E1011 10:39:14.295649 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:14.298185 master-0 kubenswrapper[4790]: I1011 10:39:14.296605 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:14.298185 master-0 kubenswrapper[4790]: E1011 10:39:14.296741 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:14.298185 master-0 kubenswrapper[4790]: E1011 10:39:14.296888 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:16.291956 master-0 kubenswrapper[4790]: I1011 10:39:16.291825 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:16.291956 master-0 kubenswrapper[4790]: I1011 10:39:16.291905 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:16.291956 master-0 kubenswrapper[4790]: I1011 10:39:16.291934 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:16.292869 master-0 kubenswrapper[4790]: E1011 10:39:16.291978 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:16.292869 master-0 kubenswrapper[4790]: E1011 10:39:16.292076 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:16.292869 master-0 kubenswrapper[4790]: E1011 10:39:16.292171 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:16.492122 master-0 kubenswrapper[4790]: I1011 10:39:16.492031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:16.492406 master-0 kubenswrapper[4790]: E1011 10:39:16.492213 4790 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 11 10:39:16.492406 master-0 kubenswrapper[4790]: E1011 10:39:16.492301 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs podName:a5b695d5-a88c-4ff9-bc59-d13f61f237f6 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:32.492284221 +0000 UTC m=+49.046744513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs") pod "network-metrics-daemon-zcc4t" (UID: "a5b695d5-a88c-4ff9-bc59-d13f61f237f6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 11 10:39:16.593487 master-0 kubenswrapper[4790]: I1011 10:39:16.593382 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:16.593487 master-0 kubenswrapper[4790]: I1011 10:39:16.593476 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:16.593823 master-0 kubenswrapper[4790]: E1011 10:39:16.593639 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 11 10:39:16.593823 master-0 kubenswrapper[4790]: E1011 10:39:16.593641 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:16.593823 master-0 kubenswrapper[4790]: E1011 10:39:16.593685 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-6-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:16.593823 master-0 kubenswrapper[4790]: E1011 10:39:16.593661 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 11 10:39:16.593823 master-0 kubenswrapper[4790]: E1011 10:39:16.593743 4790 projected.go:194] Error preparing data for projected volume kube-api-access-8xlkt for pod openshift-network-diagnostics/network-check-target-bn2sv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 11 10:39:16.593823 master-0 kubenswrapper[4790]: E1011 10:39:16.593786 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access podName:50029f27-4009-4075-b148-02f232416a57 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:24.593758607 +0000 UTC m=+41.148218919 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access") pod "installer-6-master-0" (UID: "50029f27-4009-4075-b148-02f232416a57") : object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:16.593823 master-0 kubenswrapper[4790]: E1011 10:39:16.593812 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt podName:0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:32.593801088 +0000 UTC m=+49.148261400 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xlkt" (UniqueName: "kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt") pod "network-check-target-bn2sv" (UID: "0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 11 10:39:17.758123 master-0 kubenswrapper[4790]: I1011 10:39:17.757991 4790 csr.go:261] certificate signing request csr-j98l9 is approved, waiting to be issued
Oct 11 10:39:17.769985 master-0 kubenswrapper[4790]: I1011 10:39:17.769906 4790 csr.go:257] certificate signing request csr-j98l9 is issued
Oct 11 10:39:18.303252 master-0 kubenswrapper[4790]: I1011 10:39:18.302685 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:18.303252 master-0 kubenswrapper[4790]: I1011 10:39:18.303256 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:18.303616 master-0 kubenswrapper[4790]: I1011 10:39:18.303357 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:18.303616 master-0 kubenswrapper[4790]: E1011 10:39:18.303527 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:18.303818 master-0 kubenswrapper[4790]: E1011 10:39:18.303728 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:18.304031 master-0 kubenswrapper[4790]: E1011 10:39:18.303951 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:18.772161 master-0 kubenswrapper[4790]: I1011 10:39:18.772041 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-10-12 10:21:12 +0000 UTC, rotation deadline is 2025-10-12 05:34:38.01693959 +0000 UTC
Oct 11 10:39:18.772161 master-0 kubenswrapper[4790]: I1011 10:39:18.772097 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h55m19.244846149s for next certificate rotation
Oct 11 10:39:19.772806 master-0 kubenswrapper[4790]: I1011 10:39:19.772748 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-10-12 10:21:12 +0000 UTC, rotation deadline is 2025-10-12 07:30:56.064650789 +0000 UTC
Oct 11 10:39:19.772806 master-0 kubenswrapper[4790]: I1011 10:39:19.772791 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h51m36.291862288s for next certificate rotation
Oct 11 10:39:20.292580 master-0 kubenswrapper[4790]: I1011 10:39:20.292463 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:20.292915 master-0 kubenswrapper[4790]: E1011 10:39:20.292759 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:20.293579 master-0 kubenswrapper[4790]: I1011 10:39:20.293533 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:20.293786 master-0 kubenswrapper[4790]: E1011 10:39:20.293700 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:20.293956 master-0 kubenswrapper[4790]: I1011 10:39:20.293910 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:20.294103 master-0 kubenswrapper[4790]: E1011 10:39:20.294054 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:22.314529 master-0 kubenswrapper[4790]: I1011 10:39:22.314471 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:22.315266 master-0 kubenswrapper[4790]: I1011 10:39:22.314638 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:22.315266 master-0 kubenswrapper[4790]: I1011 10:39:22.314836 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:22.315266 master-0 kubenswrapper[4790]: E1011 10:39:22.314856 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:22.315266 master-0 kubenswrapper[4790]: E1011 10:39:22.314948 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:22.315266 master-0 kubenswrapper[4790]: E1011 10:39:22.315036 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:24.293372 master-0 kubenswrapper[4790]: I1011 10:39:24.292660 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:24.293372 master-0 kubenswrapper[4790]: I1011 10:39:24.293314 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:24.294363 master-0 kubenswrapper[4790]: E1011 10:39:24.293520 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:24.294363 master-0 kubenswrapper[4790]: I1011 10:39:24.293878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:24.294447 master-0 kubenswrapper[4790]: E1011 10:39:24.294402 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:24.294615 master-0 kubenswrapper[4790]: E1011 10:39:24.294579 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:24.461415 master-0 kubenswrapper[4790]: I1011 10:39:24.460311 4790 generic.go:334] "Generic (PLEG): container finished" podID="417d5cfd-0cf3-4d96-b901-fcfe4f742ca5" containerID="8d6c1b823de6d3bbb1ce290ecfdd81097a24f1a6b64ac3d1baa8dbfab78727e3" exitCode=0
Oct 11 10:39:24.461415 master-0 kubenswrapper[4790]: I1011 10:39:24.460972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerDied","Data":"8d6c1b823de6d3bbb1ce290ecfdd81097a24f1a6b64ac3d1baa8dbfab78727e3"}
Oct 11 10:39:24.463183 master-0 kubenswrapper[4790]: I1011 10:39:24.463167 4790 generic.go:334] "Generic (PLEG): container finished" podID="24d4b452-8f49-4e9e-98b6-3429afefc4c4" containerID="cb1b25e322d42bbe15cffe1d3217152fa186cf0eca7d74f0dcb251ca7411c341" exitCode=0
Oct 11 10:39:24.463657 master-0 kubenswrapper[4790]: I1011 10:39:24.463316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerDied","Data":"cb1b25e322d42bbe15cffe1d3217152fa186cf0eca7d74f0dcb251ca7411c341"}
Oct 11 10:39:24.465224 master-0 kubenswrapper[4790]: I1011 10:39:24.465199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r499q" event={"ID":"8c1c727b-713a-4dff-ae8b-ad9b9851adae","Type":"ContainerStarted","Data":"7954596edbe6a6aeecda34dd5fce3bda1928053feaf02e194e6f5c3aedc1471a"}
Oct 11 10:39:24.466987 master-0 kubenswrapper[4790]: I1011 10:39:24.466643 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-85bvx" event={"ID":"bfe05233-94bf-4e16-8c7e-321435ba7f00","Type":"ContainerStarted","Data":"c7885915bf1943aca7a37762abd568286448906e5423ad01a0c6735e8a9ffab6"}
Oct 11 10:39:24.469363 master-0 kubenswrapper[4790]: I1011 10:39:24.468573 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_117b8efe269c98124cf5022ab3c340a5/kube-rbac-proxy-crio/1.log"
Oct 11 10:39:24.482477 master-0 kubenswrapper[4790]: I1011 10:39:24.482426 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"117b8efe269c98124cf5022ab3c340a5","Type":"ContainerStarted","Data":"27b1d02a9c060f9f7b751a24b5b4858a6e202d522b4e5837cb0be6cbd788c231"}
Oct 11 10:39:24.488525 master-0 kubenswrapper[4790]: I1011 10:39:24.488492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l66k2" event={"ID":"7d9f4c3d-57bd-49f6-94f2-47670b385318","Type":"ContainerStarted","Data":"6b183fb3917d41ad1a8552e7885aa4b5b49499993b0af87d458e1c7ff3f4620c"}
Oct 11 10:39:24.488615 master-0 kubenswrapper[4790]: I1011 10:39:24.488601 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l66k2" event={"ID":"7d9f4c3d-57bd-49f6-94f2-47670b385318","Type":"ContainerStarted","Data":"d9b098d87397c6534971baf6cd9a23d22ce280cdf7aa79fbcfcf04a94fdb3c37"}
Oct 11 10:39:24.490026 master-0 kubenswrapper[4790]: I1011 10:39:24.489975 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5kghv" event={"ID":"00e9cb61-65c4-4e6a-bb0c-2428529c63bf","Type":"ContainerStarted","Data":"ba7a48c8c170f0539b9626753f16469e71298d5b1ce649847a842c1bd11e5612"}
Oct 11 10:39:24.492670 master-0 kubenswrapper[4790]: I1011 10:39:24.492631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g99cx" event={"ID":"4e2d32e6-3363-4389-ad6a-cfd917e568d2","Type":"ContainerStarted","Data":"d71e6c5741d1252fb04a794dd00c47f3f9910b893f74e8a5143da2763dcedf64"}
Oct 11 10:39:24.495430 master-0 kubenswrapper[4790]: I1011 10:39:24.495389 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kh4ld" event={"ID":"0a2f91f6-f87a-4b69-a47a-91ca827d8386","Type":"ContainerStarted","Data":"b4b76686cfa1337380eb37f9edf14704fb60d927e8f7fdb2c130cf4fe2f40ff0"}
Oct 11 10:39:24.495430 master-0 kubenswrapper[4790]: I1011 10:39:24.495426 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kh4ld" event={"ID":"0a2f91f6-f87a-4b69-a47a-91ca827d8386","Type":"ContainerStarted","Data":"311126a080ca6dd36b989aad9f05139f8f993501c6941fce3a509ded5c7edd89"}
Oct 11 10:39:24.535256 master-0 kubenswrapper[4790]: I1011 10:39:24.535140 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r499q" podStartSLOduration=11.418270635 podStartE2EDuration="34.535116555s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:39:00.961636471 +0000 UTC m=+17.516096763" lastFinishedPulling="2025-10-11 10:39:24.078482391 +0000 UTC m=+40.632942683" observedRunningTime="2025-10-11 10:39:24.533910784 +0000 UTC m=+41.088371096" watchObservedRunningTime="2025-10-11 10:39:24.535116555 +0000 UTC m=+41.089576847"
Oct 11 10:39:24.555544 master-0 kubenswrapper[4790]: I1011 10:39:24.555435 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-kh4ld" podStartSLOduration=10.436692239 podStartE2EDuration="33.555410896s" podCreationTimestamp="2025-10-11 10:38:51 +0000 UTC" firstStartedPulling="2025-10-11 10:39:00.998629782 +0000 UTC m=+17.553090074" lastFinishedPulling="2025-10-11 10:39:24.117348439 +0000 UTC m=+40.671808731" observedRunningTime="2025-10-11 10:39:24.554946954 +0000 UTC m=+41.109407266" watchObservedRunningTime="2025-10-11 10:39:24.555410896 +0000 UTC m=+41.109871188"
Oct 11 10:39:24.576647 master-0 kubenswrapper[4790]: I1011 10:39:24.576575 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=33.576551178 podStartE2EDuration="33.576551178s" podCreationTimestamp="2025-10-11 10:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:39:24.576349933 +0000 UTC m=+41.130810235" watchObservedRunningTime="2025-10-11 10:39:24.576551178 +0000 UTC m=+41.131011470"
Oct 11 10:39:24.651821 master-0 kubenswrapper[4790]: I1011 10:39:24.650803 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g99cx" podStartSLOduration=11.703467748 podStartE2EDuration="34.650780335s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:39:01.038252109 +0000 UTC m=+17.592712431" lastFinishedPulling="2025-10-11 10:39:23.985564696 +0000 UTC m=+40.540025018" observedRunningTime="2025-10-11 10:39:24.65060592 +0000 UTC m=+41.205066222" watchObservedRunningTime="2025-10-11 10:39:24.650780335 +0000 UTC m=+41.205240707"
Oct 11 10:39:24.687981 master-0 kubenswrapper[4790]: I1011 10:39:24.687863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") pod \"installer-6-master-0\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") " pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:24.688249 master-0 kubenswrapper[4790]: E1011 10:39:24.688199 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:24.688303 master-0 kubenswrapper[4790]: E1011 10:39:24.688271 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-6-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:24.688434 master-0 kubenswrapper[4790]: E1011 10:39:24.688399 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access podName:50029f27-4009-4075-b148-02f232416a57 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:40.688358459 +0000 UTC m=+57.242818791 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access") pod "installer-6-master-0" (UID: "50029f27-4009-4075-b148-02f232416a57") : object "openshift-etcd"/"kube-root-ca.crt" not registered
Oct 11 10:39:24.715906 master-0 kubenswrapper[4790]: I1011 10:39:24.715675 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-l66k2" podStartSLOduration=33.48788785 podStartE2EDuration="34.71563506s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:39:01.038249859 +0000 UTC m=+17.592710191" lastFinishedPulling="2025-10-11 10:39:02.265997109 +0000 UTC m=+18.820457401" observedRunningTime="2025-10-11 10:39:24.713486774 +0000 UTC m=+41.267947116" watchObservedRunningTime="2025-10-11 10:39:24.71563506 +0000 UTC m=+41.270095392"
Oct 11 10:39:24.748021 master-0 kubenswrapper[4790]: I1011 10:39:24.747655 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5kghv" podStartSLOduration=11.705252622 podStartE2EDuration="34.74761501s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:39:00.902475853 +0000 UTC m=+17.456936185" lastFinishedPulling="2025-10-11 10:39:23.944838241 +0000 UTC m=+40.499298573" observedRunningTime="2025-10-11 10:39:24.746557573 +0000 UTC m=+41.301017915" watchObservedRunningTime="2025-10-11 10:39:24.74761501 +0000 UTC m=+41.302075342"
Oct 11 10:39:24.782914 master-0 kubenswrapper[4790]: I1011 10:39:24.782810 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-85bvx" podStartSLOduration=11.715509817000001 podStartE2EDuration="34.782784664s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:39:00.92146286 +0000 UTC m=+17.475923162" lastFinishedPulling="2025-10-11 10:39:23.988737687 +0000 UTC m=+40.543198009" observedRunningTime="2025-10-11 10:39:24.782616119 +0000 UTC m=+41.337076411" watchObservedRunningTime="2025-10-11 10:39:24.782784664 +0000 UTC m=+41.337244976"
Oct 11 10:39:25.508121 master-0 kubenswrapper[4790]: I1011 10:39:25.508033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"b5faae4cb3ce806047cd66c065a54f6c8cc6b120d3d6c1a930b8eb04fb788f18"}
Oct 11 10:39:25.508121 master-0 kubenswrapper[4790]: I1011 10:39:25.508117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"7355ba655a327634066827f4e80f5fe8032e43bdabdd01970c30815cb9d86537"}
Oct 11 10:39:25.510144 master-0 kubenswrapper[4790]: I1011 10:39:25.508157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"ab19d5c0142bc874df4b98658c457d1cdc054f9b46eef50595af10649131145b"}
Oct 11 10:39:25.510144 master-0 kubenswrapper[4790]: I1011 10:39:25.508178 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"0953b5b0c5d6edfebd7e041d85e453f7d46a7e288f2dbe6db61c650e49aa3ec0"}
Oct 11 10:39:25.510144 master-0 kubenswrapper[4790]: I1011 10:39:25.508197 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"a66d1fdbb33d748a1a06a36bf1348949b781c536249b656184762e926e180206"}
Oct 11 10:39:25.510144 master-0 kubenswrapper[4790]: I1011 10:39:25.508215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"1c19f9bbf921ae3af539ff0dff6e8cc4553b77a82249a509f0b4aa7f76a3e97f"}
Oct 11 10:39:26.292841 master-0 kubenswrapper[4790]: I1011 10:39:26.292614 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:26.292841 master-0 kubenswrapper[4790]: I1011 10:39:26.292668 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:26.292841 master-0 kubenswrapper[4790]: I1011 10:39:26.292676 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:26.293216 master-0 kubenswrapper[4790]: E1011 10:39:26.292905 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-6-master-0" podUID="50029f27-4009-4075-b148-02f232416a57"
Oct 11 10:39:26.293216 master-0 kubenswrapper[4790]: E1011 10:39:26.292811 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:26.293216 master-0 kubenswrapper[4790]: E1011 10:39:26.292982 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:27.521963 master-0 kubenswrapper[4790]: I1011 10:39:27.521662 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"6a176c78e46a2ac85d5511e5328a902be27c1e6cbfc1e616a7087d989017fbb7"}
Oct 11 10:39:28.265178 master-0 kubenswrapper[4790]: I1011 10:39:28.265049 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-6-master-0"]
Oct 11 10:39:28.265495 master-0 kubenswrapper[4790]: I1011 10:39:28.265248 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:28.275214 master-0 kubenswrapper[4790]: I1011 10:39:28.275139 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-6-master-0"
Oct 11 10:39:28.292453 master-0 kubenswrapper[4790]: I1011 10:39:28.292317 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t"
Oct 11 10:39:28.292613 master-0 kubenswrapper[4790]: I1011 10:39:28.292488 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:39:28.292838 master-0 kubenswrapper[4790]: E1011 10:39:28.292754 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6"
Oct 11 10:39:28.292953 master-0 kubenswrapper[4790]: E1011 10:39:28.292859 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7"
Oct 11 10:39:28.425481 master-0 kubenswrapper[4790]: I1011 10:39:28.425394 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-kubelet-dir\") pod \"50029f27-4009-4075-b148-02f232416a57\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") "
Oct 11 10:39:28.425481 master-0 kubenswrapper[4790]: I1011 10:39:28.425468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-var-lock\") pod \"50029f27-4009-4075-b148-02f232416a57\" (UID: \"50029f27-4009-4075-b148-02f232416a57\") "
Oct 11 10:39:28.425801 master-0 kubenswrapper[4790]: I1011 10:39:28.425537 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "50029f27-4009-4075-b148-02f232416a57" (UID: "50029f27-4009-4075-b148-02f232416a57"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:39:28.425801 master-0 kubenswrapper[4790]: I1011 10:39:28.425597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-var-lock" (OuterVolumeSpecName: "var-lock") pod "50029f27-4009-4075-b148-02f232416a57" (UID: "50029f27-4009-4075-b148-02f232416a57"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:39:28.425801 master-0 kubenswrapper[4790]: I1011 10:39:28.425661 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Oct 11 10:39:28.425801 master-0 kubenswrapper[4790]: I1011 10:39:28.425674 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/50029f27-4009-4075-b148-02f232416a57-var-lock\") on node \"master-0\" DevicePath \"\""
Oct 11 10:39:28.524800 master-0 kubenswrapper[4790]: I1011 10:39:28.524569 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-etcd/installer-6-master-0" Oct 11 10:39:28.577082 master-0 kubenswrapper[4790]: I1011 10:39:28.576990 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-6-master-0"] Oct 11 10:39:28.587449 master-0 kubenswrapper[4790]: I1011 10:39:28.587356 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/installer-6-master-0"] Oct 11 10:39:28.728472 master-0 kubenswrapper[4790]: I1011 10:39:28.728426 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50029f27-4009-4075-b148-02f232416a57-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:39:30.292829 master-0 kubenswrapper[4790]: I1011 10:39:30.292276 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:30.292829 master-0 kubenswrapper[4790]: I1011 10:39:30.292334 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:30.293861 master-0 kubenswrapper[4790]: E1011 10:39:30.292846 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:30.293861 master-0 kubenswrapper[4790]: E1011 10:39:30.292963 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:30.535728 master-0 kubenswrapper[4790]: I1011 10:39:30.534966 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" event={"ID":"417d5cfd-0cf3-4d96-b901-fcfe4f742ca5","Type":"ContainerStarted","Data":"2f0526028039267cde2979d801362373c8768640c849ad28a6187f6ce5f10f04"} Oct 11 10:39:30.535728 master-0 kubenswrapper[4790]: I1011 10:39:30.535236 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:30.535728 master-0 kubenswrapper[4790]: I1011 10:39:30.535556 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:30.992249 master-0 kubenswrapper[4790]: I1011 10:39:30.992152 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:32.268803 master-0 kubenswrapper[4790]: I1011 10:39:32.267961 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" podStartSLOduration=18.158643306 podStartE2EDuration="41.267925783s" podCreationTimestamp="2025-10-11 10:38:51 +0000 UTC" firstStartedPulling="2025-10-11 10:39:01.011493971 +0000 UTC m=+17.565954263" lastFinishedPulling="2025-10-11 10:39:24.120776438 +0000 UTC m=+40.675236740" observedRunningTime="2025-10-11 10:39:30.58899739 +0000 UTC m=+47.143457722" watchObservedRunningTime="2025-10-11 10:39:32.267925783 +0000 UTC m=+48.822386115" Oct 11 10:39:32.268803 master-0 kubenswrapper[4790]: I1011 10:39:32.268640 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-7-master-0"] Oct 11 10:39:32.269630 master-0 kubenswrapper[4790]: I1011 10:39:32.269147 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.269630 master-0 kubenswrapper[4790]: E1011 10:39:32.269234 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-7-master-0" podUID="7153671c-589d-434b-88b4-36e3f0e3a585" Oct 11 10:39:32.292124 master-0 kubenswrapper[4790]: I1011 10:39:32.292056 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:32.292418 master-0 kubenswrapper[4790]: E1011 10:39:32.292349 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:32.293220 master-0 kubenswrapper[4790]: I1011 10:39:32.293164 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:32.295646 master-0 kubenswrapper[4790]: E1011 10:39:32.295365 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:32.362568 master-0 kubenswrapper[4790]: I1011 10:39:32.362462 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.362568 master-0 kubenswrapper[4790]: I1011 10:39:32.362546 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-var-lock\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.362860 master-0 kubenswrapper[4790]: I1011 10:39:32.362620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.463290 master-0 kubenswrapper[4790]: I1011 10:39:32.463120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.463290 master-0 kubenswrapper[4790]: I1011 10:39:32.463203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-kubelet-dir\") pod 
\"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.463290 master-0 kubenswrapper[4790]: I1011 10:39:32.463251 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-var-lock\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.463290 master-0 kubenswrapper[4790]: I1011 10:39:32.463321 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-var-lock\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.463836 master-0 kubenswrapper[4790]: I1011 10:39:32.463798 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:32.496970 master-0 kubenswrapper[4790]: E1011 10:39:32.496902 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:32.496970 master-0 kubenswrapper[4790]: E1011 10:39:32.496947 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-7-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:32.497273 master-0 kubenswrapper[4790]: E1011 10:39:32.497022 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access podName:7153671c-589d-434b-88b4-36e3f0e3a585 nodeName:}" failed. 
No retries permitted until 2025-10-11 10:39:32.996997935 +0000 UTC m=+49.551458227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access") pod "installer-7-master-0" (UID: "7153671c-589d-434b-88b4-36e3f0e3a585") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:32.587966 master-0 kubenswrapper[4790]: I1011 10:39:32.587877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:32.588641 master-0 kubenswrapper[4790]: E1011 10:39:32.588581 4790 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:32.588913 master-0 kubenswrapper[4790]: E1011 10:39:32.588893 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs podName:a5b695d5-a88c-4ff9-bc59-d13f61f237f6 nodeName:}" failed. No retries permitted until 2025-10-11 10:40:04.588853252 +0000 UTC m=+81.143313564 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs") pod "network-metrics-daemon-zcc4t" (UID: "a5b695d5-a88c-4ff9-bc59-d13f61f237f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 10:39:32.593465 master-0 kubenswrapper[4790]: I1011 10:39:32.593401 4790 generic.go:334] "Generic (PLEG): container finished" podID="24d4b452-8f49-4e9e-98b6-3429afefc4c4" containerID="60ae2bc66c768fd133fb33de0b797bb3ba3a737ec0842ecc5b07d80a61c2f2b0" exitCode=0 Oct 11 10:39:32.593619 master-0 kubenswrapper[4790]: I1011 10:39:32.593558 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerDied","Data":"60ae2bc66c768fd133fb33de0b797bb3ba3a737ec0842ecc5b07d80a61c2f2b0"} Oct 11 10:39:32.689438 master-0 kubenswrapper[4790]: I1011 10:39:32.689334 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:32.689751 master-0 kubenswrapper[4790]: E1011 10:39:32.689541 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 10:39:32.689751 master-0 kubenswrapper[4790]: E1011 10:39:32.689560 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 10:39:32.689751 master-0 kubenswrapper[4790]: E1011 10:39:32.689571 4790 projected.go:194] Error preparing data for projected volume kube-api-access-8xlkt for 
pod openshift-network-diagnostics/network-check-target-bn2sv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:32.689980 master-0 kubenswrapper[4790]: E1011 10:39:32.689804 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt podName:0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7 nodeName:}" failed. No retries permitted until 2025-10-11 10:40:04.689776093 +0000 UTC m=+81.244236405 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xlkt" (UniqueName: "kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt") pod "network-check-target-bn2sv" (UID: "0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 10:39:33.092037 master-0 kubenswrapper[4790]: I1011 10:39:33.091944 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:33.092417 master-0 kubenswrapper[4790]: E1011 10:39:33.092132 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:33.092417 master-0 kubenswrapper[4790]: E1011 10:39:33.092152 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-7-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:33.092417 master-0 kubenswrapper[4790]: E1011 10:39:33.092207 4790 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access podName:7153671c-589d-434b-88b4-36e3f0e3a585 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:34.092188485 +0000 UTC m=+50.646648777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access") pod "installer-7-master-0" (UID: "7153671c-589d-434b-88b4-36e3f0e3a585") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:33.292123 master-0 kubenswrapper[4790]: I1011 10:39:33.291981 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:33.292925 master-0 kubenswrapper[4790]: E1011 10:39:33.292166 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-7-master-0" podUID="7153671c-589d-434b-88b4-36e3f0e3a585" Oct 11 10:39:34.100224 master-0 kubenswrapper[4790]: I1011 10:39:34.099976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:34.100489 master-0 kubenswrapper[4790]: E1011 10:39:34.100249 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:34.100489 master-0 kubenswrapper[4790]: E1011 10:39:34.100275 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-7-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:34.100489 master-0 kubenswrapper[4790]: E1011 10:39:34.100337 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access podName:7153671c-589d-434b-88b4-36e3f0e3a585 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:36.100316026 +0000 UTC m=+52.654776308 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access") pod "installer-7-master-0" (UID: "7153671c-589d-434b-88b4-36e3f0e3a585") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:34.292597 master-0 kubenswrapper[4790]: I1011 10:39:34.292452 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:34.292597 master-0 kubenswrapper[4790]: I1011 10:39:34.292540 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:34.293890 master-0 kubenswrapper[4790]: E1011 10:39:34.293824 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:34.293981 master-0 kubenswrapper[4790]: E1011 10:39:34.293942 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:34.601682 master-0 kubenswrapper[4790]: I1011 10:39:34.601564 4790 generic.go:334] "Generic (PLEG): container finished" podID="24d4b452-8f49-4e9e-98b6-3429afefc4c4" containerID="577dbabbd1dcda298e8312f0abf41bca5da23d3e379e323dc62243a5ec9eb24c" exitCode=0 Oct 11 10:39:34.601682 master-0 kubenswrapper[4790]: I1011 10:39:34.601640 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerDied","Data":"577dbabbd1dcda298e8312f0abf41bca5da23d3e379e323dc62243a5ec9eb24c"} Oct 11 10:39:35.292664 master-0 kubenswrapper[4790]: I1011 10:39:35.292507 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:35.293435 master-0 kubenswrapper[4790]: E1011 10:39:35.292809 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-7-master-0" podUID="7153671c-589d-434b-88b4-36e3f0e3a585" Oct 11 10:39:36.117448 master-0 kubenswrapper[4790]: I1011 10:39:36.117308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:36.117782 master-0 kubenswrapper[4790]: E1011 10:39:36.117548 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:36.117782 master-0 kubenswrapper[4790]: E1011 10:39:36.117596 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-7-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:36.117782 master-0 kubenswrapper[4790]: E1011 10:39:36.117688 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access podName:7153671c-589d-434b-88b4-36e3f0e3a585 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:40.117658368 +0000 UTC m=+56.672118850 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access") pod "installer-7-master-0" (UID: "7153671c-589d-434b-88b4-36e3f0e3a585") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:36.292038 master-0 kubenswrapper[4790]: I1011 10:39:36.291922 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:36.292038 master-0 kubenswrapper[4790]: I1011 10:39:36.291994 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:36.292502 master-0 kubenswrapper[4790]: E1011 10:39:36.292129 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:36.292502 master-0 kubenswrapper[4790]: E1011 10:39:36.292296 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:36.613600 master-0 kubenswrapper[4790]: I1011 10:39:36.613482 4790 generic.go:334] "Generic (PLEG): container finished" podID="24d4b452-8f49-4e9e-98b6-3429afefc4c4" containerID="913e1a8c1cb851c82ef33685641673d1d715c6fa27efd620af4c66cb97b43d12" exitCode=0 Oct 11 10:39:36.613600 master-0 kubenswrapper[4790]: I1011 10:39:36.613580 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerDied","Data":"913e1a8c1cb851c82ef33685641673d1d715c6fa27efd620af4c66cb97b43d12"} Oct 11 10:39:37.292074 master-0 kubenswrapper[4790]: I1011 10:39:37.291972 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:37.292391 master-0 kubenswrapper[4790]: E1011 10:39:37.292215 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-7-master-0" podUID="7153671c-589d-434b-88b4-36e3f0e3a585" Oct 11 10:39:38.291884 master-0 kubenswrapper[4790]: I1011 10:39:38.291454 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:38.292469 master-0 kubenswrapper[4790]: I1011 10:39:38.291454 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:38.292469 master-0 kubenswrapper[4790]: E1011 10:39:38.291954 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:38.292469 master-0 kubenswrapper[4790]: E1011 10:39:38.292119 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:39.291453 master-0 kubenswrapper[4790]: I1011 10:39:39.291388 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:39.291883 master-0 kubenswrapper[4790]: E1011 10:39:39.291526 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-7-master-0" podUID="7153671c-589d-434b-88b4-36e3f0e3a585" Oct 11 10:39:40.169369 master-0 kubenswrapper[4790]: I1011 10:39:40.169292 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:40.170126 master-0 kubenswrapper[4790]: E1011 10:39:40.169570 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:40.170126 master-0 kubenswrapper[4790]: E1011 10:39:40.169616 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-7-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:40.170126 master-0 kubenswrapper[4790]: E1011 10:39:40.169696 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access podName:7153671c-589d-434b-88b4-36e3f0e3a585 nodeName:}" failed. No retries permitted until 2025-10-11 10:39:48.169673815 +0000 UTC m=+64.724134107 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access") pod "installer-7-master-0" (UID: "7153671c-589d-434b-88b4-36e3f0e3a585") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:40.292366 master-0 kubenswrapper[4790]: I1011 10:39:40.292276 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:40.292366 master-0 kubenswrapper[4790]: I1011 10:39:40.292351 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:40.292646 master-0 kubenswrapper[4790]: E1011 10:39:40.292452 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:40.292646 master-0 kubenswrapper[4790]: E1011 10:39:40.292567 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:40.670970 master-0 kubenswrapper[4790]: I1011 10:39:40.670897 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-7-master-0"] Oct 11 10:39:40.671221 master-0 kubenswrapper[4790]: I1011 10:39:40.671035 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:40.677875 master-0 kubenswrapper[4790]: I1011 10:39:40.677853 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:40.874959 master-0 kubenswrapper[4790]: I1011 10:39:40.874855 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-var-lock\") pod \"7153671c-589d-434b-88b4-36e3f0e3a585\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " Oct 11 10:39:40.874959 master-0 kubenswrapper[4790]: I1011 10:39:40.874914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-kubelet-dir\") pod \"7153671c-589d-434b-88b4-36e3f0e3a585\" (UID: \"7153671c-589d-434b-88b4-36e3f0e3a585\") " Oct 11 10:39:40.875313 master-0 kubenswrapper[4790]: I1011 10:39:40.874999 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-var-lock" (OuterVolumeSpecName: "var-lock") pod "7153671c-589d-434b-88b4-36e3f0e3a585" (UID: "7153671c-589d-434b-88b4-36e3f0e3a585"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:39:40.875313 master-0 kubenswrapper[4790]: I1011 10:39:40.875110 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7153671c-589d-434b-88b4-36e3f0e3a585" (UID: "7153671c-589d-434b-88b4-36e3f0e3a585"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:39:40.976030 master-0 kubenswrapper[4790]: I1011 10:39:40.975857 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-var-lock\") on node \"master-0\" DevicePath \"\"" Oct 11 10:39:40.976030 master-0 kubenswrapper[4790]: I1011 10:39:40.975925 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7153671c-589d-434b-88b4-36e3f0e3a585-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:39:41.022045 master-0 kubenswrapper[4790]: I1011 10:39:41.021986 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:41.025607 master-0 kubenswrapper[4790]: I1011 10:39:41.025571 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:41.039877 master-0 kubenswrapper[4790]: I1011 10:39:41.039786 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" podUID="417d5cfd-0cf3-4d96-b901-fcfe4f742ca5" containerName="ovnkube-controller" probeResult="failure" output="" Oct 11 10:39:41.627833 master-0 kubenswrapper[4790]: I1011 10:39:41.627765 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-7-master-0" Oct 11 10:39:41.646568 master-0 kubenswrapper[4790]: I1011 10:39:41.646511 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" podUID="417d5cfd-0cf3-4d96-b901-fcfe4f742ca5" containerName="ovnkube-controller" probeResult="failure" output="" Oct 11 10:39:42.134463 master-0 kubenswrapper[4790]: I1011 10:39:42.134347 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-7-master-0"] Oct 11 10:39:42.212334 master-0 kubenswrapper[4790]: I1011 10:39:42.212288 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/installer-7-master-0"] Oct 11 10:39:42.287405 master-0 kubenswrapper[4790]: I1011 10:39:42.287334 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7153671c-589d-434b-88b4-36e3f0e3a585-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:39:42.292494 master-0 kubenswrapper[4790]: I1011 10:39:42.292450 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:42.292575 master-0 kubenswrapper[4790]: I1011 10:39:42.292545 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:42.292761 master-0 kubenswrapper[4790]: E1011 10:39:42.292675 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:42.292902 master-0 kubenswrapper[4790]: E1011 10:39:42.292848 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:44.293295 master-0 kubenswrapper[4790]: I1011 10:39:44.293230 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:44.294431 master-0 kubenswrapper[4790]: I1011 10:39:44.293258 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:44.294990 master-0 kubenswrapper[4790]: E1011 10:39:44.294938 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:44.295142 master-0 kubenswrapper[4790]: E1011 10:39:44.295086 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:44.304349 master-0 kubenswrapper[4790]: I1011 10:39:44.304302 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-8-master-0"] Oct 11 10:39:44.304664 master-0 kubenswrapper[4790]: I1011 10:39:44.304631 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.304766 master-0 kubenswrapper[4790]: E1011 10:39:44.304731 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-8-master-0" podUID="a3934355-bb61-4316-b164-05294e12906a" Oct 11 10:39:44.405745 master-0 kubenswrapper[4790]: I1011 10:39:44.405592 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-kubelet-dir\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.405745 master-0 kubenswrapper[4790]: I1011 10:39:44.405648 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-var-lock\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.405745 master-0 kubenswrapper[4790]: I1011 10:39:44.405775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.506213 master-0 kubenswrapper[4790]: I1011 10:39:44.506081 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-var-lock\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.506213 master-0 kubenswrapper[4790]: I1011 10:39:44.506214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.506213 master-0 kubenswrapper[4790]: I1011 10:39:44.506221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-var-lock\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.506676 master-0 kubenswrapper[4790]: I1011 10:39:44.506640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-kubelet-dir\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.506776 master-0 kubenswrapper[4790]: I1011 10:39:44.506701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-kubelet-dir\") pod 
\"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:44.576617 master-0 kubenswrapper[4790]: E1011 10:39:44.576543 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:44.576617 master-0 kubenswrapper[4790]: E1011 10:39:44.576585 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-8-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:44.576617 master-0 kubenswrapper[4790]: E1011 10:39:44.576644 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access podName:a3934355-bb61-4316-b164-05294e12906a nodeName:}" failed. No retries permitted until 2025-10-11 10:39:45.076620476 +0000 UTC m=+61.631080768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access") pod "installer-8-master-0" (UID: "a3934355-bb61-4316-b164-05294e12906a") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:45.110768 master-0 kubenswrapper[4790]: I1011 10:39:45.110568 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:45.111122 master-0 kubenswrapper[4790]: E1011 10:39:45.110822 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:45.111122 master-0 kubenswrapper[4790]: E1011 10:39:45.110863 4790 projected.go:194] Error preparing data for 
projected volume kube-api-access for pod openshift-etcd/installer-8-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:45.111122 master-0 kubenswrapper[4790]: E1011 10:39:45.110928 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access podName:a3934355-bb61-4316-b164-05294e12906a nodeName:}" failed. No retries permitted until 2025-10-11 10:39:46.110906122 +0000 UTC m=+62.665366414 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access") pod "installer-8-master-0" (UID: "a3934355-bb61-4316-b164-05294e12906a") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:46.120033 master-0 kubenswrapper[4790]: I1011 10:39:46.119900 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:46.120831 master-0 kubenswrapper[4790]: E1011 10:39:46.120233 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:46.120831 master-0 kubenswrapper[4790]: E1011 10:39:46.120275 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-8-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:46.120831 master-0 kubenswrapper[4790]: E1011 10:39:46.120370 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access podName:a3934355-bb61-4316-b164-05294e12906a nodeName:}" failed. 
No retries permitted until 2025-10-11 10:39:48.120339578 +0000 UTC m=+64.674799910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access") pod "installer-8-master-0" (UID: "a3934355-bb61-4316-b164-05294e12906a") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:46.291969 master-0 kubenswrapper[4790]: I1011 10:39:46.291786 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:46.291969 master-0 kubenswrapper[4790]: I1011 10:39:46.291901 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:46.292455 master-0 kubenswrapper[4790]: I1011 10:39:46.291808 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:46.292455 master-0 kubenswrapper[4790]: E1011 10:39:46.292104 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:46.292455 master-0 kubenswrapper[4790]: E1011 10:39:46.292167 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-8-master-0" podUID="a3934355-bb61-4316-b164-05294e12906a" Oct 11 10:39:46.292455 master-0 kubenswrapper[4790]: E1011 10:39:46.292267 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:47.652298 master-0 kubenswrapper[4790]: I1011 10:39:47.652243 4790 generic.go:334] "Generic (PLEG): container finished" podID="24d4b452-8f49-4e9e-98b6-3429afefc4c4" containerID="bc0358c40c75dc89bef20966f0a2851fd62d9b0845f2052ef84938a96fac4d83" exitCode=0 Oct 11 10:39:47.652842 master-0 kubenswrapper[4790]: I1011 10:39:47.652297 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerDied","Data":"bc0358c40c75dc89bef20966f0a2851fd62d9b0845f2052ef84938a96fac4d83"} Oct 11 10:39:48.139379 master-0 kubenswrapper[4790]: I1011 10:39:48.138813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:48.139379 master-0 kubenswrapper[4790]: E1011 10:39:48.139137 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:48.140015 master-0 kubenswrapper[4790]: E1011 10:39:48.139415 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-8-master-0: object 
"openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:48.140015 master-0 kubenswrapper[4790]: E1011 10:39:48.139479 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access podName:a3934355-bb61-4316-b164-05294e12906a nodeName:}" failed. No retries permitted until 2025-10-11 10:39:52.139457594 +0000 UTC m=+68.693917886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access") pod "installer-8-master-0" (UID: "a3934355-bb61-4316-b164-05294e12906a") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:48.199604 master-0 kubenswrapper[4790]: I1011 10:39:48.198895 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-8-master-0"] Oct 11 10:39:48.199604 master-0 kubenswrapper[4790]: I1011 10:39:48.199043 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:48.199604 master-0 kubenswrapper[4790]: E1011 10:39:48.199153 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-8-master-0" podUID="a3934355-bb61-4316-b164-05294e12906a" Oct 11 10:39:48.204184 master-0 kubenswrapper[4790]: I1011 10:39:48.203811 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zcc4t"] Oct 11 10:39:48.204184 master-0 kubenswrapper[4790]: I1011 10:39:48.203872 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bn2sv"] Oct 11 10:39:48.204184 master-0 kubenswrapper[4790]: I1011 10:39:48.203967 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:48.204184 master-0 kubenswrapper[4790]: E1011 10:39:48.204044 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:48.204184 master-0 kubenswrapper[4790]: I1011 10:39:48.204106 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:48.204184 master-0 kubenswrapper[4790]: E1011 10:39:48.204155 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:48.660971 master-0 kubenswrapper[4790]: I1011 10:39:48.660909 4790 generic.go:334] "Generic (PLEG): container finished" podID="24d4b452-8f49-4e9e-98b6-3429afefc4c4" containerID="a63f08ea91bf39979073d78bd84f5dd5d89bf91aa04b704a104dde5b04c85341" exitCode=0 Oct 11 10:39:48.661889 master-0 kubenswrapper[4790]: I1011 10:39:48.661017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerDied","Data":"a63f08ea91bf39979073d78bd84f5dd5d89bf91aa04b704a104dde5b04c85341"} Oct 11 10:39:49.292064 master-0 kubenswrapper[4790]: I1011 10:39:49.291887 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:49.292269 master-0 kubenswrapper[4790]: E1011 10:39:49.292103 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-etcd/installer-8-master-0" podUID="a3934355-bb61-4316-b164-05294e12906a" Oct 11 10:39:49.674171 master-0 kubenswrapper[4790]: I1011 10:39:49.674040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" event={"ID":"24d4b452-8f49-4e9e-98b6-3429afefc4c4","Type":"ContainerStarted","Data":"37ac6c36a12e5c9753eb09f407c7e986b907eb87aa9519edd17357fa157ee20e"} Oct 11 10:39:50.292386 master-0 kubenswrapper[4790]: I1011 10:39:50.292261 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:50.292772 master-0 kubenswrapper[4790]: I1011 10:39:50.292295 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:50.292772 master-0 kubenswrapper[4790]: E1011 10:39:50.292423 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:50.292772 master-0 kubenswrapper[4790]: E1011 10:39:50.292569 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:50.825332 master-0 kubenswrapper[4790]: I1011 10:39:50.825268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-96nq6" Oct 11 10:39:50.877276 master-0 kubenswrapper[4790]: I1011 10:39:50.877137 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ft6fv" podStartSLOduration=15.12860171 podStartE2EDuration="1m0.877114978s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:39:00.97209223 +0000 UTC m=+17.526552522" lastFinishedPulling="2025-10-11 10:39:46.720605498 +0000 UTC m=+63.275065790" observedRunningTime="2025-10-11 10:39:49.720948416 +0000 UTC m=+66.275408768" watchObservedRunningTime="2025-10-11 10:39:50.877114978 +0000 UTC m=+67.431575270" Oct 11 10:39:51.294821 master-0 kubenswrapper[4790]: I1011 10:39:51.294638 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:51.295137 master-0 kubenswrapper[4790]: E1011 10:39:51.294959 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-etcd/installer-8-master-0" podUID="a3934355-bb61-4316-b164-05294e12906a" Oct 11 10:39:52.187913 master-0 kubenswrapper[4790]: I1011 10:39:52.187828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:39:52.189181 master-0 kubenswrapper[4790]: E1011 10:39:52.188087 4790 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:52.189181 master-0 kubenswrapper[4790]: E1011 10:39:52.188149 4790 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-8-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:52.189181 master-0 kubenswrapper[4790]: E1011 10:39:52.188248 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access podName:a3934355-bb61-4316-b164-05294e12906a nodeName:}" failed. No retries permitted until 2025-10-11 10:40:00.188217569 +0000 UTC m=+76.742677891 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access") pod "installer-8-master-0" (UID: "a3934355-bb61-4316-b164-05294e12906a") : object "openshift-etcd"/"kube-root-ca.crt" not registered Oct 11 10:39:52.291619 master-0 kubenswrapper[4790]: I1011 10:39:52.291504 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:52.291850 master-0 kubenswrapper[4790]: I1011 10:39:52.291623 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:52.291850 master-0 kubenswrapper[4790]: E1011 10:39:52.291725 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcc4t" podUID="a5b695d5-a88c-4ff9-bc59-d13f61f237f6" Oct 11 10:39:52.292182 master-0 kubenswrapper[4790]: E1011 10:39:52.291848 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bn2sv" podUID="0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7" Oct 11 10:39:52.856936 master-0 kubenswrapper[4790]: I1011 10:39:52.856839 4790 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Oct 11 10:39:52.857330 master-0 kubenswrapper[4790]: I1011 10:39:52.857097 4790 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 11 10:39:52.916135 master-0 kubenswrapper[4790]: I1011 10:39:52.915821 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-897b595f-xctr8"] Oct 11 10:39:52.916456 master-0 kubenswrapper[4790]: I1011 10:39:52.916400 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"] Oct 11 10:39:52.916686 master-0 kubenswrapper[4790]: I1011 10:39:52.916609 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:52.917064 master-0 kubenswrapper[4790]: I1011 10:39:52.917017 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:52.921001 master-0 kubenswrapper[4790]: I1011 10:39:52.920956 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 11 10:39:52.921181 master-0 kubenswrapper[4790]: I1011 10:39:52.921148 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 11 10:39:52.921367 master-0 kubenswrapper[4790]: I1011 10:39:52.921338 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 11 10:39:52.921563 master-0 kubenswrapper[4790]: I1011 10:39:52.921537 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-z46lz"
Oct 11 10:39:52.921662 master-0 kubenswrapper[4790]: I1011 10:39:52.921645 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 11 10:39:52.922151 master-0 kubenswrapper[4790]: I1011 10:39:52.922117 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 11 10:39:52.922511 master-0 kubenswrapper[4790]: I1011 10:39:52.922492 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-vwjkz"
Oct 11 10:39:52.923137 master-0 kubenswrapper[4790]: I1011 10:39:52.923092 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 11 10:39:52.923270 master-0 kubenswrapper[4790]: I1011 10:39:52.923252 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 11 10:39:52.923530 master-0 kubenswrapper[4790]: I1011 10:39:52.923513 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 11 10:39:52.923664 master-0 kubenswrapper[4790]: I1011 10:39:52.923623 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 11 10:39:52.923857 master-0 kubenswrapper[4790]: I1011 10:39:52.923839 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 11 10:39:52.935619 master-0 kubenswrapper[4790]: I1011 10:39:52.935544 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 11 10:39:52.950274 master-0 kubenswrapper[4790]: I1011 10:39:52.950133 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-897b595f-xctr8"]
Oct 11 10:39:52.952967 master-0 kubenswrapper[4790]: I1011 10:39:52.952868 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"]
Oct 11 10:39:52.990489 master-0 kubenswrapper[4790]: I1011 10:39:52.990402 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6xnjz"]
Oct 11 10:39:52.991617 master-0 kubenswrapper[4790]: I1011 10:39:52.991544 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:52.996506 master-0 kubenswrapper[4790]: I1011 10:39:52.996455 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-kcxf6"
Oct 11 10:39:52.997145 master-0 kubenswrapper[4790]: I1011 10:39:52.997057 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 11 10:39:52.997950 master-0 kubenswrapper[4790]: I1011 10:39:52.997919 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 11 10:39:52.998154 master-0 kubenswrapper[4790]: I1011 10:39:52.998079 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 11 10:39:53.014953 master-0 kubenswrapper[4790]: I1011 10:39:53.014860 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6xnjz"]
Oct 11 10:39:53.040571 master-0 kubenswrapper[4790]: I1011 10:39:53.040503 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xznwp"]
Oct 11 10:39:53.041355 master-0 kubenswrapper[4790]: I1011 10:39:53.041313 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.044540 master-0 kubenswrapper[4790]: I1011 10:39:53.044414 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-dqfsj"]
Oct 11 10:39:53.045237 master-0 kubenswrapper[4790]: I1011 10:39:53.044761 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.047688 master-0 kubenswrapper[4790]: I1011 10:39:53.047645 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-psm75"
Oct 11 10:39:53.048859 master-0 kubenswrapper[4790]: I1011 10:39:53.048834 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 11 10:39:53.049241 master-0 kubenswrapper[4790]: I1011 10:39:53.049176 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 11 10:39:53.049761 master-0 kubenswrapper[4790]: I1011 10:39:53.049734 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"iptables-alerter-dockercfg-tndgd"
Oct 11 10:39:53.049962 master-0 kubenswrapper[4790]: I1011 10:39:53.049861 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 11 10:39:53.049962 master-0 kubenswrapper[4790]: I1011 10:39:53.049902 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 11 10:39:53.050211 master-0 kubenswrapper[4790]: I1011 10:39:53.050173 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 11 10:39:53.059221 master-0 kubenswrapper[4790]: I1011 10:39:53.059128 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xznwp"]
Oct 11 10:39:53.096490 master-0 kubenswrapper[4790]: I1011 10:39:53.096419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-config\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.096702 master-0 kubenswrapper[4790]: I1011 10:39:53.096520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-config-volume\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.096702 master-0 kubenswrapper[4790]: I1011 10:39:53.096541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-config\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.096702 master-0 kubenswrapper[4790]: I1011 10:39:53.096564 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-serving-cert\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.096702 master-0 kubenswrapper[4790]: I1011 10:39:53.096586 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df4afece-b896-4fea-8b5f-ccebc400ee9f-cert\") pod \"ingress-canary-6xnjz\" (UID: \"df4afece-b896-4fea-8b5f-ccebc400ee9f\") " pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:53.096702 master-0 kubenswrapper[4790]: I1011 10:39:53.096652 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fph6m\" (UniqueName: \"kubernetes.io/projected/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-kube-api-access-fph6m\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.096702 master-0 kubenswrapper[4790]: I1011 10:39:53.096722 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-client-ca\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.096763 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-metrics-tls\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.096794 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67ae8836-ab0a-4b32-acc6-f828c159c96e-host-slash\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.096845 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xmh\" (UniqueName: \"kubernetes.io/projected/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-kube-api-access-g8xmh\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.096874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67ae8836-ab0a-4b32-acc6-f828c159c96e-iptables-alerter-script\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.096907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-client-ca\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.096933 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv9ss\" (UniqueName: \"kubernetes.io/projected/df4afece-b896-4fea-8b5f-ccebc400ee9f-kube-api-access-rv9ss\") pod \"ingress-canary-6xnjz\" (UID: \"df4afece-b896-4fea-8b5f-ccebc400ee9f\") " pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.096982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-proxy-ca-bundles\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.097033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlfkw\" (UniqueName: \"kubernetes.io/projected/67ae8836-ab0a-4b32-acc6-f828c159c96e-kube-api-access-jlfkw\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.097074 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-serving-cert\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.098015 master-0 kubenswrapper[4790]: I1011 10:39:53.097101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wclz\" (UniqueName: \"kubernetes.io/projected/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-kube-api-access-6wclz\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.139731 master-0 kubenswrapper[4790]: I1011 10:39:53.139633 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cpn6z"]
Oct 11 10:39:53.140373 master-0 kubenswrapper[4790]: I1011 10:39:53.140313 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.143327 master-0 kubenswrapper[4790]: I1011 10:39:53.143172 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-xzm85"
Oct 11 10:39:53.144094 master-0 kubenswrapper[4790]: I1011 10:39:53.144029 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 11 10:39:53.144827 master-0 kubenswrapper[4790]: I1011 10:39:53.144275 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 11 10:39:53.198126 master-0 kubenswrapper[4790]: I1011 10:39:53.197980 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-serving-cert\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.198126 master-0 kubenswrapper[4790]: I1011 10:39:53.198073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df4afece-b896-4fea-8b5f-ccebc400ee9f-cert\") pod \"ingress-canary-6xnjz\" (UID: \"df4afece-b896-4fea-8b5f-ccebc400ee9f\") " pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fph6m\" (UniqueName: \"kubernetes.io/projected/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-kube-api-access-fph6m\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198177 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-client-ca\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198224 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-metrics-tls\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67ae8836-ab0a-4b32-acc6-f828c159c96e-host-slash\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xmh\" (UniqueName: \"kubernetes.io/projected/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-kube-api-access-g8xmh\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198455 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67ae8836-ab0a-4b32-acc6-f828c159c96e-iptables-alerter-script\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-client-ca\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv9ss\" (UniqueName: \"kubernetes.io/projected/df4afece-b896-4fea-8b5f-ccebc400ee9f-kube-api-access-rv9ss\") pod \"ingress-canary-6xnjz\" (UID: \"df4afece-b896-4fea-8b5f-ccebc400ee9f\") " pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-proxy-ca-bundles\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlfkw\" (UniqueName: \"kubernetes.io/projected/67ae8836-ab0a-4b32-acc6-f828c159c96e-kube-api-access-jlfkw\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198743 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-serving-cert\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198800 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wclz\" (UniqueName: \"kubernetes.io/projected/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-kube-api-access-6wclz\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-config\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.198949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-config-volume\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.199000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-config\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.199162 master-0 kubenswrapper[4790]: I1011 10:39:53.199106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67ae8836-ab0a-4b32-acc6-f828c159c96e-host-slash\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.200639 master-0 kubenswrapper[4790]: I1011 10:39:53.200556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67ae8836-ab0a-4b32-acc6-f828c159c96e-iptables-alerter-script\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.201142 master-0 kubenswrapper[4790]: I1011 10:39:53.201057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-config-volume\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.201272 master-0 kubenswrapper[4790]: I1011 10:39:53.201063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-client-ca\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.201532 master-0 kubenswrapper[4790]: I1011 10:39:53.201435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-client-ca\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.201847 master-0 kubenswrapper[4790]: I1011 10:39:53.201727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-proxy-ca-bundles\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.202205 master-0 kubenswrapper[4790]: I1011 10:39:53.202139 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-config\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.202787 master-0 kubenswrapper[4790]: I1011 10:39:53.202695 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-config\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.207547 master-0 kubenswrapper[4790]: I1011 10:39:53.207064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-metrics-tls\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.207703 master-0 kubenswrapper[4790]: I1011 10:39:53.207357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df4afece-b896-4fea-8b5f-ccebc400ee9f-cert\") pod \"ingress-canary-6xnjz\" (UID: \"df4afece-b896-4fea-8b5f-ccebc400ee9f\") " pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:53.208016 master-0 kubenswrapper[4790]: I1011 10:39:53.207950 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-serving-cert\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.208489 master-0 kubenswrapper[4790]: I1011 10:39:53.208420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-serving-cert\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.220178 master-0 kubenswrapper[4790]: I1011 10:39:53.220105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wclz\" (UniqueName: \"kubernetes.io/projected/9c1b597b-dba4-4011-9acd-e6d40ed8aea4-kube-api-access-6wclz\") pod \"dns-default-xznwp\" (UID: \"9c1b597b-dba4-4011-9acd-e6d40ed8aea4\") " pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.222664 master-0 kubenswrapper[4790]: I1011 10:39:53.222566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlfkw\" (UniqueName: \"kubernetes.io/projected/67ae8836-ab0a-4b32-acc6-f828c159c96e-kube-api-access-jlfkw\") pod \"iptables-alerter-dqfsj\" (UID: \"67ae8836-ab0a-4b32-acc6-f828c159c96e\") " pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.223952 master-0 kubenswrapper[4790]: I1011 10:39:53.223868 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fph6m\" (UniqueName: \"kubernetes.io/projected/8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a-kube-api-access-fph6m\") pod \"controller-manager-897b595f-xctr8\" (UID: \"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a\") " pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.227369 master-0 kubenswrapper[4790]: I1011 10:39:53.227222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv9ss\" (UniqueName: \"kubernetes.io/projected/df4afece-b896-4fea-8b5f-ccebc400ee9f-kube-api-access-rv9ss\") pod \"ingress-canary-6xnjz\" (UID: \"df4afece-b896-4fea-8b5f-ccebc400ee9f\") " pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:53.232035 master-0 kubenswrapper[4790]: I1011 10:39:53.231961 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xmh\" (UniqueName: \"kubernetes.io/projected/dd5837b4-2687-45b6-b9d5-6ef37d7d47fc-kube-api-access-g8xmh\") pod \"route-controller-manager-57c8488cd7-czzdv\" (UID: \"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc\") " pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.243554 master-0 kubenswrapper[4790]: I1011 10:39:53.243472 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-897b595f-xctr8"
Oct 11 10:39:53.251474 master-0 kubenswrapper[4790]: I1011 10:39:53.251411 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"
Oct 11 10:39:53.296839 master-0 kubenswrapper[4790]: I1011 10:39:53.291794 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0"
Oct 11 10:39:53.296839 master-0 kubenswrapper[4790]: I1011 10:39:53.295752 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-xbqxb"
Oct 11 10:39:53.296839 master-0 kubenswrapper[4790]: I1011 10:39:53.296080 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt"
Oct 11 10:39:53.309095 master-0 kubenswrapper[4790]: I1011 10:39:53.299817 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-node-bootstrap-token\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.309095 master-0 kubenswrapper[4790]: I1011 10:39:53.299879 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-certs\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.309095 master-0 kubenswrapper[4790]: I1011 10:39:53.300411 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6rjm\" (UniqueName: \"kubernetes.io/projected/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-kube-api-access-k6rjm\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.314531 master-0 kubenswrapper[4790]: I1011 10:39:53.314232 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6xnjz"
Oct 11 10:39:53.366208 master-0 kubenswrapper[4790]: I1011 10:39:53.362670 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xznwp"
Oct 11 10:39:53.378364 master-0 kubenswrapper[4790]: I1011 10:39:53.378251 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dqfsj"
Oct 11 10:39:53.401837 master-0 kubenswrapper[4790]: I1011 10:39:53.401783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6rjm\" (UniqueName: \"kubernetes.io/projected/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-kube-api-access-k6rjm\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.401944 master-0 kubenswrapper[4790]: I1011 10:39:53.401898 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-node-bootstrap-token\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.401944 master-0 kubenswrapper[4790]: I1011 10:39:53.401928 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-certs\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.410348 master-0 kubenswrapper[4790]: I1011 10:39:53.410306 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-certs\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.411400 master-0 kubenswrapper[4790]: I1011 10:39:53.411333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-node-bootstrap-token\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.442147 master-0 kubenswrapper[4790]: I1011 10:39:53.442100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6rjm\" (UniqueName: \"kubernetes.io/projected/4ce595b2-b8f1-40d5-85db-c1ea82bda0c3-kube-api-access-k6rjm\") pod \"machine-config-server-cpn6z\" (UID: \"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3\") " pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.456179 master-0 kubenswrapper[4790]: I1011 10:39:53.456050 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"]
Oct 11 10:39:53.456533 master-0 kubenswrapper[4790]: I1011 10:39:53.456499 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0"
Oct 11 10:39:53.460020 master-0 kubenswrapper[4790]: I1011 10:39:53.459971 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Oct 11 10:39:53.460295 master-0 kubenswrapper[4790]: I1011 10:39:53.460257 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-js756"
Oct 11 10:39:53.467176 master-0 kubenswrapper[4790]: I1011 10:39:53.467143 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cpn6z"
Oct 11 10:39:53.474867 master-0 kubenswrapper[4790]: I1011 10:39:53.472376 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"]
Oct 11 10:39:53.537314 master-0 kubenswrapper[4790]: I1011 10:39:53.536640 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-897b595f-xctr8"]
Oct 11 10:39:53.548816 master-0 kubenswrapper[4790]: I1011 10:39:53.548763 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv"]
Oct 11 10:39:53.549988 master-0 kubenswrapper[4790]: W1011 10:39:53.549930 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5908fc_cbf1_412d_ae91_23e3bbdf2b1a.slice/crio-66f870a5a10335a87cc92b89a6bb3d0d45fb46a25453b56104a6708b70365fe2 WatchSource:0}: Error finding container 66f870a5a10335a87cc92b89a6bb3d0d45fb46a25453b56104a6708b70365fe2: Status 404 returned error can't find the container with id 66f870a5a10335a87cc92b89a6bb3d0d45fb46a25453b56104a6708b70365fe2
Oct 11 10:39:53.566339 master-0 kubenswrapper[4790]: W1011 10:39:53.566247 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5837b4_2687_45b6_b9d5_6ef37d7d47fc.slice/crio-e5586526960a62ff8e99ed810d664ce57e3140abea4f81be691e37ea95bcb805 WatchSource:0}: Error finding container e5586526960a62ff8e99ed810d664ce57e3140abea4f81be691e37ea95bcb805: Status 404 returned error can't find the container with id e5586526960a62ff8e99ed810d664ce57e3140abea4f81be691e37ea95bcb805
Oct 11 10:39:53.571799 master-0 kubenswrapper[4790]: I1011 10:39:53.571760 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6xnjz"]
Oct 11 10:39:53.593383 master-0 kubenswrapper[4790]: W1011 10:39:53.593318 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4afece_b896_4fea_8b5f_ccebc400ee9f.slice/crio-64315d7e7fcd2574b74ea824285b5ff4af543c2a46185fc3af4306baf1c75247 WatchSource:0}: Error finding container 64315d7e7fcd2574b74ea824285b5ff4af543c2a46185fc3af4306baf1c75247: Status 404 returned error can't find the container with id 64315d7e7fcd2574b74ea824285b5ff4af543c2a46185fc3af4306baf1c75247
Oct 11 10:39:53.604824 master-0 kubenswrapper[4790]: I1011 10:39:53.604790 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f810d826-e11a-4e68-8b42-f9cc96815f6e-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0"
Oct 11 10:39:53.604897 master-0 kubenswrapper[4790]: I1011 10:39:53.604852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f810d826-e11a-4e68-8b42-f9cc96815f6e-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0"
Oct 11 10:39:53.619425 master-0 kubenswrapper[4790]: I1011 10:39:53.619386 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xznwp"]
Oct 11 10:39:53.625920 master-0 kubenswrapper[4790]: W1011 10:39:53.625867 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c1b597b_dba4_4011_9acd_e6d40ed8aea4.slice/crio-033682d36b7a106f3804a482d8511eae40961fbcf552c6f58c6a876c5bdffe1f WatchSource:0}: Error finding container 033682d36b7a106f3804a482d8511eae40961fbcf552c6f58c6a876c5bdffe1f:
Status 404 returned error can't find the container with id 033682d36b7a106f3804a482d8511eae40961fbcf552c6f58c6a876c5bdffe1f Oct 11 10:39:53.691557 master-0 kubenswrapper[4790]: I1011 10:39:53.691466 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xznwp" event={"ID":"9c1b597b-dba4-4011-9acd-e6d40ed8aea4","Type":"ContainerStarted","Data":"033682d36b7a106f3804a482d8511eae40961fbcf552c6f58c6a876c5bdffe1f"} Oct 11 10:39:53.693416 master-0 kubenswrapper[4790]: I1011 10:39:53.693368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6xnjz" event={"ID":"df4afece-b896-4fea-8b5f-ccebc400ee9f","Type":"ContainerStarted","Data":"64315d7e7fcd2574b74ea824285b5ff4af543c2a46185fc3af4306baf1c75247"} Oct 11 10:39:53.695395 master-0 kubenswrapper[4790]: I1011 10:39:53.695350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cpn6z" event={"ID":"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3","Type":"ContainerStarted","Data":"cbcffed3e591804c0115b4d5d01e0e54f333f6c8f5c24286210a94589ba8b0c8"} Oct 11 10:39:53.695491 master-0 kubenswrapper[4790]: I1011 10:39:53.695404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cpn6z" event={"ID":"4ce595b2-b8f1-40d5-85db-c1ea82bda0c3","Type":"ContainerStarted","Data":"c7ce1fa5bd22e7dc35a2712ea20ec03215ba2152098c2d197ffbd414f9c01f1c"} Oct 11 10:39:53.698162 master-0 kubenswrapper[4790]: I1011 10:39:53.698081 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dqfsj" event={"ID":"67ae8836-ab0a-4b32-acc6-f828c159c96e","Type":"ContainerStarted","Data":"eca40f1cc150e140c4b9eb4dd21f799f353cf62a23358250afef8d5640679c67"} Oct 11 10:39:53.699102 master-0 kubenswrapper[4790]: I1011 10:39:53.699054 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv" event={"ID":"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc","Type":"ContainerStarted","Data":"e5586526960a62ff8e99ed810d664ce57e3140abea4f81be691e37ea95bcb805"} Oct 11 10:39:53.700755 master-0 kubenswrapper[4790]: I1011 10:39:53.700631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-897b595f-xctr8" event={"ID":"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a","Type":"ContainerStarted","Data":"66f870a5a10335a87cc92b89a6bb3d0d45fb46a25453b56104a6708b70365fe2"} Oct 11 10:39:53.705748 master-0 kubenswrapper[4790]: I1011 10:39:53.705666 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f810d826-e11a-4e68-8b42-f9cc96815f6e-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Oct 11 10:39:53.705748 master-0 kubenswrapper[4790]: I1011 10:39:53.705743 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f810d826-e11a-4e68-8b42-f9cc96815f6e-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Oct 11 10:39:53.706310 master-0 kubenswrapper[4790]: I1011 10:39:53.706162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f810d826-e11a-4e68-8b42-f9cc96815f6e-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Oct 11 10:39:53.740228 master-0 kubenswrapper[4790]: I1011 10:39:53.740137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f810d826-e11a-4e68-8b42-f9cc96815f6e-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Oct 11 10:39:53.807189 master-0 kubenswrapper[4790]: I1011 10:39:53.807073 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Oct 11 10:39:54.003587 master-0 kubenswrapper[4790]: I1011 10:39:54.003423 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cpn6z" podStartSLOduration=1.003397762 podStartE2EDuration="1.003397762s" podCreationTimestamp="2025-10-11 10:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:39:53.72411558 +0000 UTC m=+70.278575872" watchObservedRunningTime="2025-10-11 10:39:54.003397762 +0000 UTC m=+70.557858054" Oct 11 10:39:54.004129 master-0 kubenswrapper[4790]: I1011 10:39:54.004098 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"] Oct 11 10:39:54.011890 master-0 kubenswrapper[4790]: W1011 10:39:54.011812 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf810d826_e11a_4e68_8b42_f9cc96815f6e.slice/crio-0037715e465467c850978e63ab18a53529c91d6f5e89e891c58217bd8c626b4a WatchSource:0}: Error finding container 0037715e465467c850978e63ab18a53529c91d6f5e89e891c58217bd8c626b4a: Status 404 returned error can't find the container with id 0037715e465467c850978e63ab18a53529c91d6f5e89e891c58217bd8c626b4a Oct 11 10:39:54.292332 master-0 kubenswrapper[4790]: I1011 10:39:54.292174 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:39:54.292332 master-0 kubenswrapper[4790]: I1011 10:39:54.292259 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:39:54.301701 master-0 kubenswrapper[4790]: I1011 10:39:54.301266 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 11 10:39:54.301701 master-0 kubenswrapper[4790]: I1011 10:39:54.301385 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-diagnostics"/"default-dockercfg-wbrmn" Oct 11 10:39:54.301701 master-0 kubenswrapper[4790]: I1011 10:39:54.301438 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-hgln7" Oct 11 10:39:54.302278 master-0 kubenswrapper[4790]: I1011 10:39:54.302102 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 11 10:39:54.302278 master-0 kubenswrapper[4790]: I1011 10:39:54.302144 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 11 10:39:54.704733 master-0 kubenswrapper[4790]: I1011 10:39:54.704628 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"f810d826-e11a-4e68-8b42-f9cc96815f6e","Type":"ContainerStarted","Data":"0037715e465467c850978e63ab18a53529c91d6f5e89e891c58217bd8c626b4a"} Oct 11 10:39:55.454539 master-0 kubenswrapper[4790]: I1011 10:39:55.454431 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Oct 11 10:39:55.455511 master-0 kubenswrapper[4790]: I1011 10:39:55.455352 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.466771 master-0 kubenswrapper[4790]: I1011 10:39:55.466723 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Oct 11 10:39:55.526765 master-0 kubenswrapper[4790]: I1011 10:39:55.526651 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6dd18b40-5213-44f7-83cd-99076fb3ee73-kube-api-access\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.526765 master-0 kubenswrapper[4790]: I1011 10:39:55.526744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.526765 master-0 kubenswrapper[4790]: I1011 10:39:55.526765 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-var-lock\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.628269 master-0 kubenswrapper[4790]: I1011 10:39:55.628034 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6dd18b40-5213-44f7-83cd-99076fb3ee73-kube-api-access\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.628269 master-0 kubenswrapper[4790]: I1011 10:39:55.628115 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.628269 master-0 kubenswrapper[4790]: I1011 10:39:55.628140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-var-lock\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.628269 master-0 kubenswrapper[4790]: I1011 10:39:55.628207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-var-lock\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.628685 master-0 kubenswrapper[4790]: I1011 10:39:55.628293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.654253 master-0 kubenswrapper[4790]: I1011 10:39:55.654141 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6dd18b40-5213-44f7-83cd-99076fb3ee73-kube-api-access\") pod \"installer-6-master-0\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") " pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:55.780843 master-0 kubenswrapper[4790]: I1011 10:39:55.780574 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:39:57.721458 master-0 kubenswrapper[4790]: I1011 10:39:57.721391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dqfsj" event={"ID":"67ae8836-ab0a-4b32-acc6-f828c159c96e","Type":"ContainerStarted","Data":"34205c3f5946e7f31a2b129497975903b84c35a1a6d98e69c55d0a92c77a2d1f"} Oct 11 10:39:58.005461 master-0 kubenswrapper[4790]: I1011 10:39:58.005431 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Oct 11 10:39:58.007877 master-0 kubenswrapper[4790]: I1011 10:39:58.007792 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dqfsj" podStartSLOduration=5.007760046 podStartE2EDuration="5.007760046s" podCreationTimestamp="2025-10-11 10:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:39:58.00522349 +0000 UTC m=+74.559683802" watchObservedRunningTime="2025-10-11 10:39:58.007760046 +0000 UTC m=+74.562220378" Oct 11 10:39:58.730652 master-0 kubenswrapper[4790]: I1011 10:39:58.730528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"6dd18b40-5213-44f7-83cd-99076fb3ee73","Type":"ContainerStarted","Data":"f810af855d03b458d1c2e2f8afd6d54238f19e74d825ff17da48e7f4eba7e4c6"} Oct 11 10:39:58.730652 master-0 kubenswrapper[4790]: I1011 10:39:58.730608 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"6dd18b40-5213-44f7-83cd-99076fb3ee73","Type":"ContainerStarted","Data":"65085622d36bce3903d45075fbfade9a38ec4d90dad3e9cbfb565e4e9d566b71"} Oct 11 10:39:58.736078 master-0 kubenswrapper[4790]: I1011 10:39:58.735977 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-xznwp" event={"ID":"9c1b597b-dba4-4011-9acd-e6d40ed8aea4","Type":"ContainerStarted","Data":"95e83832bbcdda9ddb74bf91b486535b14f3c77c85ad6272287c3b59b03885bf"} Oct 11 10:39:58.736078 master-0 kubenswrapper[4790]: I1011 10:39:58.736063 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xznwp" event={"ID":"9c1b597b-dba4-4011-9acd-e6d40ed8aea4","Type":"ContainerStarted","Data":"e91ba8938ccac1d9cbe66fa44dcbe3d0380a800c8dcfe1774f11b62abeee6e4e"} Oct 11 10:39:58.736296 master-0 kubenswrapper[4790]: I1011 10:39:58.736169 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-xznwp" Oct 11 10:39:58.738087 master-0 kubenswrapper[4790]: I1011 10:39:58.738009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6xnjz" event={"ID":"df4afece-b896-4fea-8b5f-ccebc400ee9f","Type":"ContainerStarted","Data":"aeb8dee2c340e2c1d5e45bb0bff615f0896e5e0fb1827f9885c0ba07ca524cd1"} Oct 11 10:39:58.740641 master-0 kubenswrapper[4790]: I1011 10:39:58.740559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv" event={"ID":"dd5837b4-2687-45b6-b9d5-6ef37d7d47fc","Type":"ContainerStarted","Data":"105b182c33715a1ffd6ad7be3a307bb1ad5281d259a0f997f8006ecb4233f7f3"} Oct 11 10:39:58.741165 master-0 kubenswrapper[4790]: I1011 10:39:58.741104 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv" Oct 11 10:39:58.742788 master-0 kubenswrapper[4790]: I1011 10:39:58.742735 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"f810d826-e11a-4e68-8b42-f9cc96815f6e","Type":"ContainerStarted","Data":"031944b70ff26e5fd30918c1f6363036efb199cbed1980002fd956f601b576fd"} Oct 11 10:39:58.745211 master-0 
kubenswrapper[4790]: I1011 10:39:58.745111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-897b595f-xctr8" event={"ID":"8c5908fc-cbf1-412d-ae91-23e3bbdf2b1a","Type":"ContainerStarted","Data":"dee15171e48f2f1e829379524605e2654b7f5be51d28a70498094c931593f837"} Oct 11 10:39:58.745469 master-0 kubenswrapper[4790]: I1011 10:39:58.745401 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-897b595f-xctr8" Oct 11 10:39:58.755214 master-0 kubenswrapper[4790]: I1011 10:39:58.755126 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-897b595f-xctr8" Oct 11 10:39:58.822461 master-0 kubenswrapper[4790]: I1011 10:39:58.822393 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv" Oct 11 10:39:59.217847 master-0 kubenswrapper[4790]: I1011 10:39:59.217685 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-master-0" podStartSLOduration=4.217656238 podStartE2EDuration="4.217656238s" podCreationTimestamp="2025-10-11 10:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:39:59.215596595 +0000 UTC m=+75.770056917" watchObservedRunningTime="2025-10-11 10:39:59.217656238 +0000 UTC m=+75.772116570" Oct 11 10:39:59.460843 master-0 kubenswrapper[4790]: I1011 10:39:59.460653 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57c8488cd7-czzdv" podStartSLOduration=9.446136633 podStartE2EDuration="13.460622039s" podCreationTimestamp="2025-10-11 10:39:46 +0000 UTC" firstStartedPulling="2025-10-11 10:39:53.568562776 +0000 UTC m=+70.123023068" 
lastFinishedPulling="2025-10-11 10:39:57.583048152 +0000 UTC m=+74.137508474" observedRunningTime="2025-10-11 10:39:59.459289234 +0000 UTC m=+76.013749566" watchObservedRunningTime="2025-10-11 10:39:59.460622039 +0000 UTC m=+76.015082401" Oct 11 10:39:59.703462 master-0 kubenswrapper[4790]: I1011 10:39:59.703372 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"] Oct 11 10:39:59.704447 master-0 kubenswrapper[4790]: I1011 10:39:59.704389 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.704913 master-0 kubenswrapper[4790]: I1011 10:39:59.704828 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-69df5d46bc-wjtq5"] Oct 11 10:39:59.705842 master-0 kubenswrapper[4790]: I1011 10:39:59.705802 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.708925 master-0 kubenswrapper[4790]: I1011 10:39:59.708863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-zlnjr" Oct 11 10:39:59.711837 master-0 kubenswrapper[4790]: I1011 10:39:59.711770 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 11 10:39:59.711837 master-0 kubenswrapper[4790]: I1011 10:39:59.711802 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 11 10:39:59.723933 master-0 kubenswrapper[4790]: I1011 10:39:59.723397 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724001 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 11 
10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724003 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724134 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-lntq9" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724184 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724016 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724516 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724533 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724552 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.724833 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.725174 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.725251 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.725281 4790 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.725415 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 11 10:39:59.725806 master-0 kubenswrapper[4790]: I1011 10:39:59.725426 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 11 10:39:59.728364 master-0 kubenswrapper[4790]: I1011 10:39:59.727450 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-76f8bc4746-9rjdm"] Oct 11 10:39:59.728565 master-0 kubenswrapper[4790]: I1011 10:39:59.728511 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 11 10:39:59.730314 master-0 kubenswrapper[4790]: I1011 10:39:59.730247 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.731191 master-0 kubenswrapper[4790]: I1011 10:39:59.731150 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"] Oct 11 10:39:59.735078 master-0 kubenswrapper[4790]: I1011 10:39:59.734037 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.735078 master-0 kubenswrapper[4790]: I1011 10:39:59.734794 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"] Oct 11 10:39:59.735345 master-0 kubenswrapper[4790]: I1011 10:39:59.735269 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.739145 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xznwp" podStartSLOduration=2.7828357329999998 podStartE2EDuration="6.73911668s" podCreationTimestamp="2025-10-11 10:39:53 +0000 UTC" firstStartedPulling="2025-10-11 10:39:53.628166742 +0000 UTC m=+70.182627034" lastFinishedPulling="2025-10-11 10:39:57.584447659 +0000 UTC m=+74.138907981" observedRunningTime="2025-10-11 10:39:59.724670595 +0000 UTC m=+76.279130947" watchObservedRunningTime="2025-10-11 10:39:59.73911668 +0000 UTC m=+76.293577042" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.742010 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.742339 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.742455 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.742589 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-8wmjp" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.742787 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.742875 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.743422 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.743452 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.743559 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.743650 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.743666 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.743752 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.743934 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.744426 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.744592 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.744611 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.745240 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.745413 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-279hr"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.745419 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.745671 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.745855 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.745918 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.746012 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.745952 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Oct 11 10:39:59.747652 master-0 kubenswrapper[4790]: I1011 10:39:59.746576 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 11 10:39:59.749677 master-0 kubenswrapper[4790]: I1011 10:39:59.748336 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"]
Oct 11 10:39:59.749677 master-0 kubenswrapper[4790]: I1011 10:39:59.748388 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-69df5d46bc-wjtq5"]
Oct 11 10:39:59.750065 master-0 kubenswrapper[4790]: I1011 10:39:59.750026 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-2ocquro0n92lc"
Oct 11 10:39:59.758230 master-0 kubenswrapper[4790]: I1011 10:39:59.757411 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_revision-pruner-6-master-0_f810d826-e11a-4e68-8b42-f9cc96815f6e/pruner/0.log"
Oct 11 10:39:59.758230 master-0 kubenswrapper[4790]: I1011 10:39:59.757497 4790 generic.go:334] "Generic (PLEG): container finished" podID="f810d826-e11a-4e68-8b42-f9cc96815f6e" containerID="031944b70ff26e5fd30918c1f6363036efb199cbed1980002fd956f601b576fd" exitCode=255
Oct 11 10:39:59.758230 master-0 kubenswrapper[4790]: I1011 10:39:59.757690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"f810d826-e11a-4e68-8b42-f9cc96815f6e","Type":"ContainerDied","Data":"031944b70ff26e5fd30918c1f6363036efb199cbed1980002fd956f601b576fd"}
Oct 11 10:39:59.758230 master-0 kubenswrapper[4790]: I1011 10:39:59.757808 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 11 10:39:59.761110 master-0 kubenswrapper[4790]: I1011 10:39:59.760454 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 11 10:39:59.767862 master-0 kubenswrapper[4790]: I1011 10:39:59.767434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 11 10:39:59.779395 master-0 kubenswrapper[4790]: I1011 10:39:59.778502 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f8bc4746-9rjdm"]
Oct 11 10:39:59.780421 master-0 kubenswrapper[4790]: I1011 10:39:59.780363 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"]
Oct 11 10:39:59.785146 master-0 kubenswrapper[4790]: I1011 10:39:59.785070 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"]
Oct 11 10:39:59.790614 master-0 kubenswrapper[4790]: I1011 10:39:59.790058 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-897b595f-xctr8" podStartSLOduration=9.736230014 podStartE2EDuration="13.79004041s" podCreationTimestamp="2025-10-11 10:39:46 +0000 UTC" firstStartedPulling="2025-10-11 10:39:53.552600053 +0000 UTC m=+70.107060345" lastFinishedPulling="2025-10-11 10:39:57.606410449 +0000 UTC m=+74.160870741" observedRunningTime="2025-10-11 10:39:59.78884959 +0000 UTC m=+76.343309922" watchObservedRunningTime="2025-10-11 10:39:59.79004041 +0000 UTC m=+76.344500702"
Oct 11 10:39:59.817225 master-0 kubenswrapper[4790]: I1011 10:39:59.817132 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6xnjz" podStartSLOduration=3.828545097 podStartE2EDuration="7.817110562s" podCreationTimestamp="2025-10-11 10:39:52 +0000 UTC" firstStartedPulling="2025-10-11 10:39:53.596407758 +0000 UTC m=+70.150868050" lastFinishedPulling="2025-10-11 10:39:57.584973223 +0000 UTC m=+74.139433515" observedRunningTime="2025-10-11 10:39:59.816209759 +0000 UTC m=+76.370670051" watchObservedRunningTime="2025-10-11 10:39:59.817110562 +0000 UTC m=+76.371570854"
Oct 11 10:39:59.842092 master-0 kubenswrapper[4790]: I1011 10:39:59.841937 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/revision-pruner-6-master-0" podStartSLOduration=3.27287778 podStartE2EDuration="6.841898185s" podCreationTimestamp="2025-10-11 10:39:53 +0000 UTC" firstStartedPulling="2025-10-11 10:39:54.014094239 +0000 UTC m=+70.568554541" lastFinishedPulling="2025-10-11 10:39:57.583114654 +0000 UTC m=+74.137574946" observedRunningTime="2025-10-11 10:39:59.839600055 +0000 UTC m=+76.394060357" watchObservedRunningTime="2025-10-11 10:39:59.841898185 +0000 UTC m=+76.396358517"
Oct 11 10:39:59.871817 master-0 kubenswrapper[4790]: I1011 10:39:59.871659 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.871817 master-0 kubenswrapper[4790]: I1011 10:39:59.871785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpsdf\" (UniqueName: \"kubernetes.io/projected/a6689745-4f25-4776-9f5c-6bfd7abe62a8-kube-api-access-mpsdf\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.871817 master-0 kubenswrapper[4790]: I1011 10:39:59.871846 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1254ac82-5820-431e-baeb-3ae7d7997b38-metrics-server-audit-profiles\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.871904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-dir\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.871987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-node-pullsecrets\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.872040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-trusted-ca-bundle\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.872085 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-policies\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.872135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-trusted-ca-bundle\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.872220 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-secret-metrics-client-certs\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.872267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-serving-cert\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.872312 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-audit-policies\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.872359 master-0 kubenswrapper[4790]: I1011 10:39:59.872366 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-serving-ca\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1254ac82-5820-431e-baeb-3ae7d7997b38-audit-log\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872543 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88h25\" (UniqueName: \"kubernetes.io/projected/099ca022-6e9c-4604-b517-d90713dd6a44-kube-api-access-88h25\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-client-ca-bundle\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872641 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-oauth-config\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872687 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-audit\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-audit-dir\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.872900 master-0 kubenswrapper[4790]: I1011 10:39:59.872826 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.873314 master-0 kubenswrapper[4790]: I1011 10:39:59.872958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-encryption-config\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.873314 master-0 kubenswrapper[4790]: I1011 10:39:59.873017 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78q9m\" (UniqueName: \"kubernetes.io/projected/1254ac82-5820-431e-baeb-3ae7d7997b38-kube-api-access-78q9m\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.873314 master-0 kubenswrapper[4790]: I1011 10:39:59.873073 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-trusted-ca-bundle\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.873314 master-0 kubenswrapper[4790]: I1011 10:39:59.873131 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-error\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.873314 master-0 kubenswrapper[4790]: I1011 10:39:59.873189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1254ac82-5820-431e-baeb-3ae7d7997b38-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.873803 master-0 kubenswrapper[4790]: I1011 10:39:59.873769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.873865 master-0 kubenswrapper[4790]: I1011 10:39:59.873835 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-serving-ca\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.874039 master-0 kubenswrapper[4790]: I1011 10:39:59.873969 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.874120 master-0 kubenswrapper[4790]: I1011 10:39:59.874086 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-encryption-config\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.874182 master-0 kubenswrapper[4790]: I1011 10:39:59.874146 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-login\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.874250 master-0 kubenswrapper[4790]: I1011 10:39:59.874180 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kt7\" (UniqueName: \"kubernetes.io/projected/76dd8647-4ad5-4874-b2d2-dee16aab637a-kube-api-access-k2kt7\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.874250 master-0 kubenswrapper[4790]: I1011 10:39:59.874216 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-serving-cert\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.874376 master-0 kubenswrapper[4790]: I1011 10:39:59.874273 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-client\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.874376 master-0 kubenswrapper[4790]: I1011 10:39:59.874310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.874376 master-0 kubenswrapper[4790]: I1011 10:39:59.874353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-service-ca\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.874667 master-0 kubenswrapper[4790]: I1011 10:39:59.874614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-image-import-ca\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.874667 master-0 kubenswrapper[4790]: I1011 10:39:59.874660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-config\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.874844 master-0 kubenswrapper[4790]: I1011 10:39:59.874691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-oauth-serving-cert\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.874844 master-0 kubenswrapper[4790]: I1011 10:39:59.874790 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-secret-metrics-server-tls\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.875049 master-0 kubenswrapper[4790]: I1011 10:39:59.874979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-session\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.875188 master-0 kubenswrapper[4790]: I1011 10:39:59.875149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.875262 master-0 kubenswrapper[4790]: I1011 10:39:59.875226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6689745-4f25-4776-9f5c-6bfd7abe62a8-audit-dir\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.875262 master-0 kubenswrapper[4790]: I1011 10:39:59.875256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljnp\" (UniqueName: \"kubernetes.io/projected/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-kube-api-access-7ljnp\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.875498 master-0 kubenswrapper[4790]: I1011 10:39:59.875438 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-config\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.875584 master-0 kubenswrapper[4790]: I1011 10:39:59.875508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-serving-cert\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.875660 master-0 kubenswrapper[4790]: I1011 10:39:59.875597 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-client\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-policies\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-trusted-ca-bundle\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-secret-metrics-client-certs\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-serving-cert\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977192 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-node-pullsecrets\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-trusted-ca-bundle\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-audit-policies\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977252 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-serving-ca\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977289 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1254ac82-5820-431e-baeb-3ae7d7997b38-audit-log\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88h25\" (UniqueName: \"kubernetes.io/projected/099ca022-6e9c-4604-b517-d90713dd6a44-kube-api-access-88h25\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977325 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-client-ca-bundle\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-oauth-config\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977361 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-audit\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977384 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-audit-dir\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.978918 master-0 kubenswrapper[4790]: I1011 10:39:59.977406 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977425 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-encryption-config\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977442 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78q9m\" (UniqueName: \"kubernetes.io/projected/1254ac82-5820-431e-baeb-3ae7d7997b38-kube-api-access-78q9m\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977459 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-trusted-ca-bundle\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977478 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-error\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1254ac82-5820-431e-baeb-3ae7d7997b38-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977535 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-serving-ca\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977555 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-encryption-config\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-login\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kt7\" (UniqueName: \"kubernetes.io/projected/76dd8647-4ad5-4874-b2d2-dee16aab637a-kube-api-access-k2kt7\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-node-pullsecrets\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.977654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-serving-cert\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.978294 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-client\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.978361 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") "
pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.980749 master-0 kubenswrapper[4790]: I1011 10:39:59.978634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-service-ca\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.978738 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-config\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.978794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-oauth-serving-cert\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.978850 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-image-import-ca\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.978893 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-secret-metrics-server-tls\") pod 
\"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.978983 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-session\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljnp\" (UniqueName: \"kubernetes.io/projected/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-kube-api-access-7ljnp\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979135 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6689745-4f25-4776-9f5c-6bfd7abe62a8-audit-dir\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979218 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-config\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-serving-cert\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979319 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-client\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979374 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1254ac82-5820-431e-baeb-3ae7d7997b38-audit-log\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979395 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: 
I1011 10:39:59.979452 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpsdf\" (UniqueName: \"kubernetes.io/projected/a6689745-4f25-4776-9f5c-6bfd7abe62a8-kube-api-access-mpsdf\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.979587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-serving-ca\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.982261 master-0 kubenswrapper[4790]: I1011 10:39:59.980015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-image-import-ca\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.983589 master-0 kubenswrapper[4790]: I1011 10:39:59.979834 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-audit-policies\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.983589 master-0 kubenswrapper[4790]: I1011 10:39:59.980965 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-config\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 
10:39:59.983589 master-0 kubenswrapper[4790]: I1011 10:39:59.981257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-trusted-ca-bundle\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.983589 master-0 kubenswrapper[4790]: I1011 10:39:59.981362 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6689745-4f25-4776-9f5c-6bfd7abe62a8-audit-dir\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.983589 master-0 kubenswrapper[4790]: I1011 10:39:59.981879 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-audit-dir\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.983589 master-0 kubenswrapper[4790]: I1011 10:39:59.982321 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1254ac82-5820-431e-baeb-3ae7d7997b38-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.983589 master-0 kubenswrapper[4790]: I1011 10:39:59.982724 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-oauth-serving-cert\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " 
pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.984251 master-0 kubenswrapper[4790]: I1011 10:39:59.983820 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1254ac82-5820-431e-baeb-3ae7d7997b38-metrics-server-audit-profiles\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.984251 master-0 kubenswrapper[4790]: I1011 10:39:59.983874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-dir\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.985205 master-0 kubenswrapper[4790]: I1011 10:39:59.984435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-dir\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.985205 master-0 kubenswrapper[4790]: I1011 10:39:59.984491 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-encryption-config\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.985454 master-0 kubenswrapper[4790]: I1011 10:39:59.985325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-encryption-config\") pod \"apiserver-656768b4df-9c8k6\" 
(UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.985623 master-0 kubenswrapper[4790]: I1011 10:39:59.985527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-trusted-ca-bundle\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.985623 master-0 kubenswrapper[4790]: I1011 10:39:59.985584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1254ac82-5820-431e-baeb-3ae7d7997b38-metrics-server-audit-profiles\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.986000 master-0 kubenswrapper[4790]: I1011 10:39:59.985873 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-config\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.986072 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-client\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.987437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-serving-cert\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.986095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-oauth-config\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.986331 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-serving-ca\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.986485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.986268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-audit\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.987557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.987384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.987657 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-login\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.987204 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.987797 master-0 kubenswrapper[4790]: I1011 10:39:59.987064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-service-ca\") pod \"console-76f8bc4746-9rjdm\" (UID: 
\"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.988303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.988666 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-serving-cert\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.988691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-secret-metrics-client-certs\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.988746 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-policies\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.988840 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.989010 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-secret-metrics-server-tls\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.989032 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1254ac82-5820-431e-baeb-3ae7d7997b38-client-ca-bundle\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:39:59.989625 master-0 kubenswrapper[4790]: I1011 10:39:59.989403 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-serving-cert\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:39:59.990276 master-0 kubenswrapper[4790]: I1011 10:39:59.990095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-user-template-error\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.990276 master-0 kubenswrapper[4790]: I1011 10:39:59.990158 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-trusted-ca-bundle\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:39:59.990412 master-0 kubenswrapper[4790]: I1011 10:39:59.990343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-session\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.991049 master-0 kubenswrapper[4790]: I1011 10:39:59.991004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-client\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:39:59.992114 master-0 kubenswrapper[4790]: I1011 10:39:59.992057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6689745-4f25-4776-9f5c-6bfd7abe62a8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:39:59.999648 master-0 kubenswrapper[4790]: I1011 10:39:59.999570 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88h25\" (UniqueName: \"kubernetes.io/projected/099ca022-6e9c-4604-b517-d90713dd6a44-kube-api-access-88h25\") pod \"apiserver-69df5d46bc-wjtq5\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " 
pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:40:00.002337 master-0 kubenswrapper[4790]: I1011 10:40:00.002283 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kt7\" (UniqueName: \"kubernetes.io/projected/76dd8647-4ad5-4874-b2d2-dee16aab637a-kube-api-access-k2kt7\") pod \"apiserver-656768b4df-9c8k6\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:40:00.005179 master-0 kubenswrapper[4790]: I1011 10:40:00.005119 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78q9m\" (UniqueName: \"kubernetes.io/projected/1254ac82-5820-431e-baeb-3ae7d7997b38-kube-api-access-78q9m\") pod \"metrics-server-7d46fcc5c6-n88q4\" (UID: \"1254ac82-5820-431e-baeb-3ae7d7997b38\") " pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:40:00.005310 master-0 kubenswrapper[4790]: I1011 10:40:00.005198 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljnp\" (UniqueName: \"kubernetes.io/projected/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-kube-api-access-7ljnp\") pod \"console-76f8bc4746-9rjdm\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:40:00.006910 master-0 kubenswrapper[4790]: I1011 10:40:00.006841 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpsdf\" (UniqueName: \"kubernetes.io/projected/a6689745-4f25-4776-9f5c-6bfd7abe62a8-kube-api-access-mpsdf\") pod \"oauth-openshift-6fccd5ccc-khqd5\" (UID: \"a6689745-4f25-4776-9f5c-6bfd7abe62a8\") " pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:40:00.042454 master-0 kubenswrapper[4790]: I1011 10:40:00.042341 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:40:00.058083 master-0 kubenswrapper[4790]: I1011 10:40:00.058022 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:40:00.076189 master-0 kubenswrapper[4790]: I1011 10:40:00.076072 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:40:00.089547 master-0 kubenswrapper[4790]: I1011 10:40:00.089460 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:40:00.099053 master-0 kubenswrapper[4790]: I1011 10:40:00.098970 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:40:00.288473 master-0 kubenswrapper[4790]: I1011 10:40:00.288409 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:40:00.294257 master-0 kubenswrapper[4790]: I1011 10:40:00.293558 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"installer-8-master-0\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " pod="openshift-etcd/installer-8-master-0" Oct 11 10:40:00.508955 master-0 kubenswrapper[4790]: I1011 10:40:00.508786 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:40:00.522564 master-0 kubenswrapper[4790]: I1011 10:40:00.522483 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"] Oct 11 10:40:00.574934 master-0 kubenswrapper[4790]: I1011 10:40:00.574726 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-69df5d46bc-wjtq5"] Oct 11 10:40:00.578769 master-0 kubenswrapper[4790]: I1011 10:40:00.578319 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f8bc4746-9rjdm"] Oct 11 10:40:00.580878 master-0 kubenswrapper[4790]: W1011 10:40:00.580788 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod099ca022_6e9c_4604_b517_d90713dd6a44.slice/crio-1623fe4c5875e03dc8879c8e18034ad8d12a92c81f76f68ae1603f1e4ba99e21 WatchSource:0}: Error finding container 1623fe4c5875e03dc8879c8e18034ad8d12a92c81f76f68ae1603f1e4ba99e21: Status 404 returned error can't find the container with id 1623fe4c5875e03dc8879c8e18034ad8d12a92c81f76f68ae1603f1e4ba99e21 Oct 11 10:40:00.587766 master-0 kubenswrapper[4790]: W1011 10:40:00.587668 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab7a5bf0_d3df_49f7_bd97_a7b9425fe9db.slice/crio-f61d3e1404aecae93207fe8056d63854d603a520f23437c58f022239f1436a77 WatchSource:0}: Error finding container f61d3e1404aecae93207fe8056d63854d603a520f23437c58f022239f1436a77: Status 404 returned error can't find the container with id f61d3e1404aecae93207fe8056d63854d603a520f23437c58f022239f1436a77 Oct 11 10:40:00.591080 master-0 kubenswrapper[4790]: I1011 10:40:00.589854 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d46fcc5c6-n88q4"] Oct 11 10:40:00.591080 master-0 kubenswrapper[4790]: I1011 10:40:00.591020 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fccd5ccc-khqd5"] Oct 11 10:40:00.598396 master-0 kubenswrapper[4790]: W1011 10:40:00.598338 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6689745_4f25_4776_9f5c_6bfd7abe62a8.slice/crio-af52309d15964e4023c841a9a9505b6c6b33b2ad039face745f16cc56a858b55 WatchSource:0}: Error finding container af52309d15964e4023c841a9a9505b6c6b33b2ad039face745f16cc56a858b55: Status 404 returned error can't find the container with id af52309d15964e4023c841a9a9505b6c6b33b2ad039face745f16cc56a858b55 Oct 11 10:40:00.598842 master-0 kubenswrapper[4790]: W1011 10:40:00.598824 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1254ac82_5820_431e_baeb_3ae7d7997b38.slice/crio-ec5edfa238e67eb6dc656c72be1dbc1f0397d7a8d80950005bcb4a224f23c220 WatchSource:0}: Error finding container ec5edfa238e67eb6dc656c72be1dbc1f0397d7a8d80950005bcb4a224f23c220: Status 404 returned error can't find the container with id ec5edfa238e67eb6dc656c72be1dbc1f0397d7a8d80950005bcb4a224f23c220 Oct 11 10:40:00.744007 master-0 kubenswrapper[4790]: I1011 10:40:00.743913 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-8-master-0"] Oct 11 10:40:00.752048 master-0 kubenswrapper[4790]: W1011 10:40:00.751936 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda3934355_bb61_4316_b164_05294e12906a.slice/crio-4a1e893e4dcbac435d2da0e68c078d559bd07b9d3b598e66849b7e045c462376 WatchSource:0}: Error finding container 4a1e893e4dcbac435d2da0e68c078d559bd07b9d3b598e66849b7e045c462376: Status 404 returned error can't find the container with id 4a1e893e4dcbac435d2da0e68c078d559bd07b9d3b598e66849b7e045c462376 Oct 11 10:40:00.762096 master-0 kubenswrapper[4790]: I1011 10:40:00.762007 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" event={"ID":"099ca022-6e9c-4604-b517-d90713dd6a44","Type":"ContainerStarted","Data":"1623fe4c5875e03dc8879c8e18034ad8d12a92c81f76f68ae1603f1e4ba99e21"} Oct 11 10:40:00.763314 master-0 kubenswrapper[4790]: I1011 10:40:00.763240 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" event={"ID":"76dd8647-4ad5-4874-b2d2-dee16aab637a","Type":"ContainerStarted","Data":"64ce93912fbe2ce263f72579fc62109333989150c0bd59c119eb0bd06f24caa2"} Oct 11 10:40:00.764250 master-0 kubenswrapper[4790]: I1011 10:40:00.764211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-8-master-0" event={"ID":"a3934355-bb61-4316-b164-05294e12906a","Type":"ContainerStarted","Data":"4a1e893e4dcbac435d2da0e68c078d559bd07b9d3b598e66849b7e045c462376"} Oct 11 10:40:00.765193 master-0 kubenswrapper[4790]: I1011 10:40:00.765163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" event={"ID":"1254ac82-5820-431e-baeb-3ae7d7997b38","Type":"ContainerStarted","Data":"ec5edfa238e67eb6dc656c72be1dbc1f0397d7a8d80950005bcb4a224f23c220"} Oct 11 10:40:00.765990 master-0 kubenswrapper[4790]: I1011 10:40:00.765964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f8bc4746-9rjdm" event={"ID":"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db","Type":"ContainerStarted","Data":"f61d3e1404aecae93207fe8056d63854d603a520f23437c58f022239f1436a77"} Oct 11 10:40:00.766950 master-0 kubenswrapper[4790]: I1011 10:40:00.766858 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" event={"ID":"a6689745-4f25-4776-9f5c-6bfd7abe62a8","Type":"ContainerStarted","Data":"af52309d15964e4023c841a9a9505b6c6b33b2ad039face745f16cc56a858b55"} Oct 11 10:40:01.166194 master-0 kubenswrapper[4790]: I1011 
10:40:01.166111 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_revision-pruner-6-master-0_f810d826-e11a-4e68-8b42-f9cc96815f6e/pruner/0.log" Oct 11 10:40:01.166521 master-0 kubenswrapper[4790]: I1011 10:40:01.166277 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Oct 11 10:40:01.200747 master-0 kubenswrapper[4790]: I1011 10:40:01.200638 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f810d826-e11a-4e68-8b42-f9cc96815f6e-kubelet-dir\") pod \"f810d826-e11a-4e68-8b42-f9cc96815f6e\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " Oct 11 10:40:01.201076 master-0 kubenswrapper[4790]: I1011 10:40:01.200800 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f810d826-e11a-4e68-8b42-f9cc96815f6e-kube-api-access\") pod \"f810d826-e11a-4e68-8b42-f9cc96815f6e\" (UID: \"f810d826-e11a-4e68-8b42-f9cc96815f6e\") " Oct 11 10:40:01.201194 master-0 kubenswrapper[4790]: I1011 10:40:01.201006 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f810d826-e11a-4e68-8b42-f9cc96815f6e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f810d826-e11a-4e68-8b42-f9cc96815f6e" (UID: "f810d826-e11a-4e68-8b42-f9cc96815f6e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:40:01.204587 master-0 kubenswrapper[4790]: I1011 10:40:01.204520 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f810d826-e11a-4e68-8b42-f9cc96815f6e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f810d826-e11a-4e68-8b42-f9cc96815f6e" (UID: "f810d826-e11a-4e68-8b42-f9cc96815f6e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:40:01.303202 master-0 kubenswrapper[4790]: I1011 10:40:01.302087 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f810d826-e11a-4e68-8b42-f9cc96815f6e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:40:01.303202 master-0 kubenswrapper[4790]: I1011 10:40:01.302136 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f810d826-e11a-4e68-8b42-f9cc96815f6e-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:40:01.775652 master-0 kubenswrapper[4790]: I1011 10:40:01.775533 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_revision-pruner-6-master-0_f810d826-e11a-4e68-8b42-f9cc96815f6e/pruner/0.log" Oct 11 10:40:01.776351 master-0 kubenswrapper[4790]: I1011 10:40:01.776305 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Oct 11 10:40:01.776401 master-0 kubenswrapper[4790]: I1011 10:40:01.775686 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"f810d826-e11a-4e68-8b42-f9cc96815f6e","Type":"ContainerDied","Data":"0037715e465467c850978e63ab18a53529c91d6f5e89e891c58217bd8c626b4a"} Oct 11 10:40:01.776793 master-0 kubenswrapper[4790]: I1011 10:40:01.776764 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0037715e465467c850978e63ab18a53529c91d6f5e89e891c58217bd8c626b4a" Oct 11 10:40:02.779865 master-0 kubenswrapper[4790]: I1011 10:40:02.779794 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" event={"ID":"1254ac82-5820-431e-baeb-3ae7d7997b38","Type":"ContainerStarted","Data":"498a71585d2faeaf2e747295cf0d441a20474e8faeb7d0c5a986a626d52eb5b9"} Oct 11 
10:40:02.780876 master-0 kubenswrapper[4790]: I1011 10:40:02.780846 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:40:02.782609 master-0 kubenswrapper[4790]: I1011 10:40:02.782580 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" event={"ID":"a6689745-4f25-4776-9f5c-6bfd7abe62a8","Type":"ContainerStarted","Data":"0156b9e81795c34b5869a2a45112f238b843886820aa5fe76bba5aad3dd2bdb4"} Oct 11 10:40:02.783077 master-0 kubenswrapper[4790]: I1011 10:40:02.783045 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:40:02.808958 master-0 kubenswrapper[4790]: I1011 10:40:02.808885 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" podStartSLOduration=303.959566786 podStartE2EDuration="5m5.808865899s" podCreationTimestamp="2025-10-11 10:34:57 +0000 UTC" firstStartedPulling="2025-10-11 10:40:00.603156635 +0000 UTC m=+77.157616927" lastFinishedPulling="2025-10-11 10:40:02.452455748 +0000 UTC m=+79.006916040" observedRunningTime="2025-10-11 10:40:02.808732656 +0000 UTC m=+79.363192948" watchObservedRunningTime="2025-10-11 10:40:02.808865899 +0000 UTC m=+79.363326201" Oct 11 10:40:02.838314 master-0 kubenswrapper[4790]: I1011 10:40:02.838257 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" podStartSLOduration=67.941684141 podStartE2EDuration="1m9.83824563s" podCreationTimestamp="2025-10-11 10:38:53 +0000 UTC" firstStartedPulling="2025-10-11 10:40:00.602386794 +0000 UTC m=+77.156847086" lastFinishedPulling="2025-10-11 10:40:02.498948283 +0000 UTC m=+79.053408575" observedRunningTime="2025-10-11 10:40:02.837487301 +0000 UTC m=+79.391947603" watchObservedRunningTime="2025-10-11 
10:40:02.83824563 +0000 UTC m=+79.392705922" Oct 11 10:40:03.117142 master-0 kubenswrapper[4790]: I1011 10:40:03.117075 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6fccd5ccc-khqd5" Oct 11 10:40:03.971169 master-0 kubenswrapper[4790]: I1011 10:40:03.971104 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 11 10:40:03.971870 master-0 kubenswrapper[4790]: E1011 10:40:03.971272 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f810d826-e11a-4e68-8b42-f9cc96815f6e" containerName="pruner" Oct 11 10:40:03.971870 master-0 kubenswrapper[4790]: I1011 10:40:03.971288 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f810d826-e11a-4e68-8b42-f9cc96815f6e" containerName="pruner" Oct 11 10:40:03.971870 master-0 kubenswrapper[4790]: I1011 10:40:03.971361 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f810d826-e11a-4e68-8b42-f9cc96815f6e" containerName="pruner" Oct 11 10:40:03.972321 master-0 kubenswrapper[4790]: I1011 10:40:03.972296 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:03.974681 master-0 kubenswrapper[4790]: I1011 10:40:03.974636 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Oct 11 10:40:03.975130 master-0 kubenswrapper[4790]: I1011 10:40:03.975100 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Oct 11 10:40:03.975635 master-0 kubenswrapper[4790]: I1011 10:40:03.975613 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Oct 11 10:40:03.976308 master-0 kubenswrapper[4790]: I1011 10:40:03.976275 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Oct 11 10:40:03.976623 master-0 kubenswrapper[4790]: I1011 10:40:03.976383 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Oct 11 10:40:03.976623 master-0 kubenswrapper[4790]: I1011 10:40:03.976492 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Oct 11 10:40:03.976831 master-0 kubenswrapper[4790]: I1011 10:40:03.976804 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Oct 11 10:40:03.986990 master-0 kubenswrapper[4790]: I1011 10:40:03.986951 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Oct 11 10:40:04.005858 master-0 kubenswrapper[4790]: I1011 10:40:04.005813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 11 10:40:04.130921 master-0 kubenswrapper[4790]: I1011 10:40:04.130825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-web-config\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.130921 master-0 kubenswrapper[4790]: I1011 10:40:04.130912 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3e6a069-f9e0-417c-9226-5ef929699b39-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.130921 master-0 kubenswrapper[4790]: I1011 10:40:04.130937 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3e6a069-f9e0-417c-9226-5ef929699b39-config-out\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131276 master-0 kubenswrapper[4790]: I1011 10:40:04.130954 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131276 master-0 kubenswrapper[4790]: I1011 10:40:04.130974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 
10:40:04.131276 master-0 kubenswrapper[4790]: I1011 10:40:04.131016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3e6a069-f9e0-417c-9226-5ef929699b39-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131276 master-0 kubenswrapper[4790]: I1011 10:40:04.131203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131276 master-0 kubenswrapper[4790]: I1011 10:40:04.131256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131411 master-0 kubenswrapper[4790]: I1011 10:40:04.131284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncg7c\" (UniqueName: \"kubernetes.io/projected/e3e6a069-f9e0-417c-9226-5ef929699b39-kube-api-access-ncg7c\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131411 master-0 kubenswrapper[4790]: I1011 10:40:04.131322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/e3e6a069-f9e0-417c-9226-5ef929699b39-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131411 master-0 kubenswrapper[4790]: I1011 10:40:04.131347 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e6a069-f9e0-417c-9226-5ef929699b39-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.131411 master-0 kubenswrapper[4790]: I1011 10:40:04.131394 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-config-volume\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.231983 master-0 kubenswrapper[4790]: I1011 10:40:04.231849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.231983 master-0 kubenswrapper[4790]: I1011 10:40:04.231915 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.231983 master-0 kubenswrapper[4790]: I1011 10:40:04.231950 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncg7c\" (UniqueName: \"kubernetes.io/projected/e3e6a069-f9e0-417c-9226-5ef929699b39-kube-api-access-ncg7c\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.231983 master-0 kubenswrapper[4790]: I1011 10:40:04.231986 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e3e6a069-f9e0-417c-9226-5ef929699b39-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e6a069-f9e0-417c-9226-5ef929699b39-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-config-volume\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-web-config\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232108 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3e6a069-f9e0-417c-9226-5ef929699b39-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232135 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3e6a069-f9e0-417c-9226-5ef929699b39-config-out\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.232278 master-0 kubenswrapper[4790]: I1011 10:40:04.232219 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3e6a069-f9e0-417c-9226-5ef929699b39-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 
10:40:04.233623 master-0 kubenswrapper[4790]: I1011 10:40:04.233117 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e3e6a069-f9e0-417c-9226-5ef929699b39-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.233623 master-0 kubenswrapper[4790]: I1011 10:40:04.233160 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e6a069-f9e0-417c-9226-5ef929699b39-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.233623 master-0 kubenswrapper[4790]: I1011 10:40:04.233561 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3e6a069-f9e0-417c-9226-5ef929699b39-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.235587 master-0 kubenswrapper[4790]: I1011 10:40:04.235553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.235953 master-0 kubenswrapper[4790]: I1011 10:40:04.235902 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-config-volume\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " 
pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.236049 master-0 kubenswrapper[4790]: I1011 10:40:04.236005 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.236260 master-0 kubenswrapper[4790]: I1011 10:40:04.236229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3e6a069-f9e0-417c-9226-5ef929699b39-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.236260 master-0 kubenswrapper[4790]: I1011 10:40:04.236239 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3e6a069-f9e0-417c-9226-5ef929699b39-config-out\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.236810 master-0 kubenswrapper[4790]: I1011 10:40:04.236650 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.237483 master-0 kubenswrapper[4790]: I1011 10:40:04.237451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-web-config\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " 
pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.243902 master-0 kubenswrapper[4790]: I1011 10:40:04.243512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e3e6a069-f9e0-417c-9226-5ef929699b39-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.254384 master-0 kubenswrapper[4790]: I1011 10:40:04.254316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncg7c\" (UniqueName: \"kubernetes.io/projected/e3e6a069-f9e0-417c-9226-5ef929699b39-kube-api-access-ncg7c\") pod \"alertmanager-main-0\" (UID: \"e3e6a069-f9e0-417c-9226-5ef929699b39\") " pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.285659 master-0 kubenswrapper[4790]: I1011 10:40:04.285569 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:04.635757 master-0 kubenswrapper[4790]: I1011 10:40:04.635615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:40:04.640990 master-0 kubenswrapper[4790]: I1011 10:40:04.640914 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b695d5-a88c-4ff9-bc59-d13f61f237f6-metrics-certs\") pod \"network-metrics-daemon-zcc4t\" (UID: \"a5b695d5-a88c-4ff9-bc59-d13f61f237f6\") " pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:40:04.736516 master-0 kubenswrapper[4790]: I1011 10:40:04.736401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:40:04.742833 master-0 kubenswrapper[4790]: I1011 10:40:04.742775 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlkt\" (UniqueName: \"kubernetes.io/projected/0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7-kube-api-access-8xlkt\") pod \"network-check-target-bn2sv\" (UID: \"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7\") " pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:40:04.813205 master-0 kubenswrapper[4790]: I1011 10:40:04.813118 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:40:04.822236 master-0 kubenswrapper[4790]: I1011 10:40:04.822180 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcc4t" Oct 11 10:40:04.929194 master-0 kubenswrapper[4790]: I1011 10:40:04.929042 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w"] Oct 11 10:40:04.930171 master-0 kubenswrapper[4790]: I1011 10:40:04.930126 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:04.936920 master-0 kubenswrapper[4790]: I1011 10:40:04.936868 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Oct 11 10:40:04.937180 master-0 kubenswrapper[4790]: I1011 10:40:04.937157 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Oct 11 10:40:04.937325 master-0 kubenswrapper[4790]: I1011 10:40:04.937303 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Oct 11 10:40:04.937468 master-0 kubenswrapper[4790]: I1011 10:40:04.937445 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Oct 11 10:40:04.937596 master-0 kubenswrapper[4790]: I1011 10:40:04.937581 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-92o819hatg7mp" Oct 11 10:40:04.937765 master-0 kubenswrapper[4790]: I1011 10:40:04.937750 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Oct 11 10:40:04.952496 master-0 kubenswrapper[4790]: I1011 10:40:04.952438 4790 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w"] Oct 11 10:40:05.040554 master-0 kubenswrapper[4790]: I1011 10:40:05.040489 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4txm\" (UniqueName: \"kubernetes.io/projected/06012e2a-b507-48ad-9740-2c3cb3af5bdf-kube-api-access-g4txm\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.041225 master-0 kubenswrapper[4790]: I1011 10:40:05.041201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.041367 master-0 kubenswrapper[4790]: I1011 10:40:05.041348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.041482 master-0 kubenswrapper[4790]: I1011 10:40:05.041463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-grpc-tls\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.041589 master-0 
kubenswrapper[4790]: I1011 10:40:05.041570 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-tls\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.041700 master-0 kubenswrapper[4790]: I1011 10:40:05.041683 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.041825 master-0 kubenswrapper[4790]: I1011 10:40:05.041808 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06012e2a-b507-48ad-9740-2c3cb3af5bdf-metrics-client-ca\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.041981 master-0 kubenswrapper[4790]: I1011 10:40:05.041943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143529 master-0 kubenswrapper[4790]: I1011 10:40:05.143440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143529 master-0 kubenswrapper[4790]: I1011 10:40:05.143540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06012e2a-b507-48ad-9740-2c3cb3af5bdf-metrics-client-ca\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143896 master-0 kubenswrapper[4790]: I1011 10:40:05.143590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143896 master-0 kubenswrapper[4790]: I1011 10:40:05.143634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4txm\" (UniqueName: \"kubernetes.io/projected/06012e2a-b507-48ad-9740-2c3cb3af5bdf-kube-api-access-g4txm\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143896 master-0 kubenswrapper[4790]: I1011 10:40:05.143737 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f646dd4d8-qxd8w\" 
(UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143896 master-0 kubenswrapper[4790]: I1011 10:40:05.143784 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143896 master-0 kubenswrapper[4790]: I1011 10:40:05.143820 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-grpc-tls\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.143896 master-0 kubenswrapper[4790]: I1011 10:40:05.143890 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-tls\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.145911 master-0 kubenswrapper[4790]: I1011 10:40:05.145839 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06012e2a-b507-48ad-9740-2c3cb3af5bdf-metrics-client-ca\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.147463 master-0 kubenswrapper[4790]: I1011 10:40:05.147409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.148222 master-0 kubenswrapper[4790]: I1011 10:40:05.148165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-tls\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.148522 master-0 kubenswrapper[4790]: I1011 10:40:05.148470 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.149433 master-0 kubenswrapper[4790]: I1011 10:40:05.149361 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-grpc-tls\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.150350 master-0 kubenswrapper[4790]: I1011 10:40:05.150296 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: 
\"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.150531 master-0 kubenswrapper[4790]: I1011 10:40:05.150482 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/06012e2a-b507-48ad-9740-2c3cb3af5bdf-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.189425 master-0 kubenswrapper[4790]: I1011 10:40:05.189286 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4txm\" (UniqueName: \"kubernetes.io/projected/06012e2a-b507-48ad-9740-2c3cb3af5bdf-kube-api-access-g4txm\") pod \"thanos-querier-7f646dd4d8-qxd8w\" (UID: \"06012e2a-b507-48ad-9740-2c3cb3af5bdf\") " pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:05.243978 master-0 kubenswrapper[4790]: I1011 10:40:05.243904 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" Oct 11 10:40:06.568118 master-0 kubenswrapper[4790]: I1011 10:40:06.568022 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bn2sv"] Oct 11 10:40:06.629448 master-0 kubenswrapper[4790]: I1011 10:40:06.629372 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zcc4t"] Oct 11 10:40:06.632513 master-0 kubenswrapper[4790]: I1011 10:40:06.632418 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 11 10:40:06.639586 master-0 kubenswrapper[4790]: I1011 10:40:06.639517 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w"] Oct 11 10:40:06.801694 master-0 kubenswrapper[4790]: I1011 10:40:06.801576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f8bc4746-9rjdm" event={"ID":"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db","Type":"ContainerStarted","Data":"199044355c2ca69bb6ac6dacb80814ae2bdf5a58183730d9a6202cd23bded675"} Oct 11 10:40:06.803298 master-0 kubenswrapper[4790]: I1011 10:40:06.803150 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" event={"ID":"06012e2a-b507-48ad-9740-2c3cb3af5bdf","Type":"ContainerStarted","Data":"56f63f58d50b03c90a73f6e1f202479291f916e6fa3121d3502960f8735cf97e"} Oct 11 10:40:06.804559 master-0 kubenswrapper[4790]: I1011 10:40:06.804484 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcc4t" event={"ID":"a5b695d5-a88c-4ff9-bc59-d13f61f237f6","Type":"ContainerStarted","Data":"edd65708c0094454d319cb7b3ab81c3dc1c5277d8b68ade357b86423ea48d1e0"} Oct 11 10:40:06.805793 master-0 kubenswrapper[4790]: I1011 10:40:06.805748 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-bn2sv" event={"ID":"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7","Type":"ContainerStarted","Data":"a06131f06cae792ba40a375398c0bcae5d23446765886bf2babe865973923793"} Oct 11 10:40:06.808269 master-0 kubenswrapper[4790]: I1011 10:40:06.808189 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-8-master-0" event={"ID":"a3934355-bb61-4316-b164-05294e12906a","Type":"ContainerStarted","Data":"7ad4b389f620d673e1c84d3f718fc34561da93dd445afd695e3bc1db0ae8b3cd"} Oct 11 10:40:06.810401 master-0 kubenswrapper[4790]: I1011 10:40:06.810316 4790 generic.go:334] "Generic (PLEG): container finished" podID="099ca022-6e9c-4604-b517-d90713dd6a44" containerID="035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c" exitCode=0 Oct 11 10:40:06.810527 master-0 kubenswrapper[4790]: I1011 10:40:06.810475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" event={"ID":"099ca022-6e9c-4604-b517-d90713dd6a44","Type":"ContainerDied","Data":"035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c"} Oct 11 10:40:06.812193 master-0 kubenswrapper[4790]: I1011 10:40:06.812138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerStarted","Data":"04d89d4a333def01fbc6dde02ce158f9828f674ae99cc8cc97c64c4898850a3c"} Oct 11 10:40:06.814407 master-0 kubenswrapper[4790]: I1011 10:40:06.814349 4790 generic.go:334] "Generic (PLEG): container finished" podID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerID="ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331" exitCode=0 Oct 11 10:40:06.814407 master-0 kubenswrapper[4790]: I1011 10:40:06.814394 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" 
event={"ID":"76dd8647-4ad5-4874-b2d2-dee16aab637a","Type":"ContainerDied","Data":"ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331"} Oct 11 10:40:06.832721 master-0 kubenswrapper[4790]: I1011 10:40:06.832615 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76f8bc4746-9rjdm" podStartSLOduration=114.240444529 podStartE2EDuration="1m59.832592485s" podCreationTimestamp="2025-10-11 10:38:07 +0000 UTC" firstStartedPulling="2025-10-11 10:40:00.591089291 +0000 UTC m=+77.145549633" lastFinishedPulling="2025-10-11 10:40:06.183237287 +0000 UTC m=+82.737697589" observedRunningTime="2025-10-11 10:40:06.832535923 +0000 UTC m=+83.386996245" watchObservedRunningTime="2025-10-11 10:40:06.832592485 +0000 UTC m=+83.387052787" Oct 11 10:40:06.912865 master-0 kubenswrapper[4790]: I1011 10:40:06.912633 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-8-master-0" podStartSLOduration=17.454828649 podStartE2EDuration="22.91260557s" podCreationTimestamp="2025-10-11 10:39:44 +0000 UTC" firstStartedPulling="2025-10-11 10:40:00.755095495 +0000 UTC m=+77.309555797" lastFinishedPulling="2025-10-11 10:40:06.212872416 +0000 UTC m=+82.767332718" observedRunningTime="2025-10-11 10:40:06.885933487 +0000 UTC m=+83.440393789" watchObservedRunningTime="2025-10-11 10:40:06.91260557 +0000 UTC m=+83.467065862" Oct 11 10:40:07.825861 master-0 kubenswrapper[4790]: I1011 10:40:07.825802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" event={"ID":"76dd8647-4ad5-4874-b2d2-dee16aab637a","Type":"ContainerStarted","Data":"35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c"} Oct 11 10:40:07.830224 master-0 kubenswrapper[4790]: I1011 10:40:07.830177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" 
event={"ID":"099ca022-6e9c-4604-b517-d90713dd6a44","Type":"ContainerStarted","Data":"29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a"} Oct 11 10:40:07.870540 master-0 kubenswrapper[4790]: I1011 10:40:07.870435 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podStartSLOduration=70.225138783 podStartE2EDuration="1m15.870376175s" podCreationTimestamp="2025-10-11 10:38:52 +0000 UTC" firstStartedPulling="2025-10-11 10:40:00.533253602 +0000 UTC m=+77.087713934" lastFinishedPulling="2025-10-11 10:40:06.178491014 +0000 UTC m=+82.732951326" observedRunningTime="2025-10-11 10:40:07.868697671 +0000 UTC m=+84.423157973" watchObservedRunningTime="2025-10-11 10:40:07.870376175 +0000 UTC m=+84.424836477" Oct 11 10:40:08.367851 master-0 kubenswrapper[4790]: I1011 10:40:08.367773 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xznwp" Oct 11 10:40:09.350704 master-0 kubenswrapper[4790]: I1011 10:40:09.350629 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 11 10:40:09.351737 master-0 kubenswrapper[4790]: I1011 10:40:09.351692 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.354450 master-0 kubenswrapper[4790]: I1011 10:40:09.354133 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Oct 11 10:40:09.354450 master-0 kubenswrapper[4790]: I1011 10:40:09.354367 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Oct 11 10:40:09.356731 master-0 kubenswrapper[4790]: I1011 10:40:09.355100 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Oct 11 10:40:09.356731 master-0 kubenswrapper[4790]: I1011 10:40:09.355108 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Oct 11 10:40:09.356731 master-0 kubenswrapper[4790]: I1011 10:40:09.355272 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Oct 11 10:40:09.356731 master-0 kubenswrapper[4790]: I1011 10:40:09.355568 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Oct 11 10:40:09.356731 master-0 kubenswrapper[4790]: I1011 10:40:09.355765 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Oct 11 10:40:09.356731 master-0 kubenswrapper[4790]: I1011 10:40:09.355858 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-6sqva262urci3" Oct 11 10:40:09.356731 master-0 kubenswrapper[4790]: I1011 10:40:09.356636 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Oct 11 10:40:09.357162 master-0 kubenswrapper[4790]: I1011 10:40:09.357049 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s" Oct 11 10:40:09.361148 master-0 kubenswrapper[4790]: I1011 10:40:09.360949 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Oct 11 10:40:09.362760 master-0 kubenswrapper[4790]: I1011 10:40:09.362725 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Oct 11 10:40:09.382591 master-0 kubenswrapper[4790]: I1011 10:40:09.382415 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 11 10:40:09.433617 master-0 kubenswrapper[4790]: I1011 10:40:09.433565 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.433992 master-0 kubenswrapper[4790]: I1011 10:40:09.433978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27006098-2092-43c6-97f8-0219e7fc4b81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.434150 master-0 kubenswrapper[4790]: I1011 10:40:09.434125 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.434270 master-0 kubenswrapper[4790]: I1011 10:40:09.434250 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.434370 master-0 kubenswrapper[4790]: I1011 10:40:09.434356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.434504 master-0 kubenswrapper[4790]: I1011 10:40:09.434488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.434639 master-0 kubenswrapper[4790]: I1011 10:40:09.434626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.434788 master-0 kubenswrapper[4790]: I1011 10:40:09.434775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " 
pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.434916 master-0 kubenswrapper[4790]: I1011 10:40:09.434896 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435030 master-0 kubenswrapper[4790]: I1011 10:40:09.435018 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27006098-2092-43c6-97f8-0219e7fc4b81-config-out\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435154 master-0 kubenswrapper[4790]: I1011 10:40:09.435140 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435272 master-0 kubenswrapper[4790]: I1011 10:40:09.435260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzrlh\" (UniqueName: \"kubernetes.io/projected/27006098-2092-43c6-97f8-0219e7fc4b81-kube-api-access-tzrlh\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435362 master-0 kubenswrapper[4790]: I1011 10:40:09.435350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-web-config\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435432 master-0 kubenswrapper[4790]: I1011 10:40:09.435419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435530 master-0 kubenswrapper[4790]: I1011 10:40:09.435517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435652 master-0 kubenswrapper[4790]: I1011 10:40:09.435639 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435791 master-0 kubenswrapper[4790]: I1011 10:40:09.435777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-config\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.435972 master-0 kubenswrapper[4790]: I1011 10:40:09.435955 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.536771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.536866 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.536897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.536923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-config\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0" 
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.536949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.536975 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.536997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27006098-2092-43c6-97f8-0219e7fc4b81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537018 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537117 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537172 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27006098-2092-43c6-97f8-0219e7fc4b81-config-out\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537233 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.539919 master-0 kubenswrapper[4790]: I1011 10:40:09.537262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzrlh\" (UniqueName: \"kubernetes.io/projected/27006098-2092-43c6-97f8-0219e7fc4b81-kube-api-access-tzrlh\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.541129 master-0 kubenswrapper[4790]: I1011 10:40:09.537285 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-web-config\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.541129 master-0 kubenswrapper[4790]: I1011 10:40:09.538698 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.541129 master-0 kubenswrapper[4790]: I1011 10:40:09.539024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.541218 master-0 kubenswrapper[4790]: I1011 10:40:09.541143 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-web-config\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.541981 master-0 kubenswrapper[4790]: I1011 10:40:09.541848 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.541981 master-0 kubenswrapper[4790]: I1011 10:40:09.541848 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.542460 master-0 kubenswrapper[4790]: I1011 10:40:09.542420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.542726 master-0 kubenswrapper[4790]: I1011 10:40:09.542637 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27006098-2092-43c6-97f8-0219e7fc4b81-config-out\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.542851 master-0 kubenswrapper[4790]: I1011 10:40:09.542754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.542851 master-0 kubenswrapper[4790]: I1011 10:40:09.542771 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-config\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.543299 master-0 kubenswrapper[4790]: I1011 10:40:09.543272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.544274 master-0 kubenswrapper[4790]: I1011 10:40:09.544226 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.544274 master-0 kubenswrapper[4790]: I1011 10:40:09.544256 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.544920 master-0 kubenswrapper[4790]: I1011 10:40:09.544858 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.546018 master-0 kubenswrapper[4790]: I1011 10:40:09.545964 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27006098-2092-43c6-97f8-0219e7fc4b81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.546274 master-0 kubenswrapper[4790]: I1011 10:40:09.546196 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27006098-2092-43c6-97f8-0219e7fc4b81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.547061 master-0 kubenswrapper[4790]: I1011 10:40:09.547009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.547812 master-0 kubenswrapper[4790]: I1011 10:40:09.547778 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27006098-2092-43c6-97f8-0219e7fc4b81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.569959 master-0 kubenswrapper[4790]: I1011 10:40:09.569834 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzrlh\" (UniqueName: \"kubernetes.io/projected/27006098-2092-43c6-97f8-0219e7fc4b81-kube-api-access-tzrlh\") pod \"prometheus-k8s-0\" (UID: \"27006098-2092-43c6-97f8-0219e7fc4b81\") " pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.669145 master-0 kubenswrapper[4790]: I1011 10:40:09.669029 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:09.840882 master-0 kubenswrapper[4790]: I1011 10:40:09.840390 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" event={"ID":"06012e2a-b507-48ad-9740-2c3cb3af5bdf","Type":"ContainerStarted","Data":"c588ca030013e7f6856c7ec7d0452ca808693f0797889a7dbc2fa05ba6679004"}
Oct 11 10:40:09.842096 master-0 kubenswrapper[4790]: I1011 10:40:09.842005 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcc4t" event={"ID":"a5b695d5-a88c-4ff9-bc59-d13f61f237f6","Type":"ContainerStarted","Data":"a046b19339ceccf7c0076522b83d91b31e1d5940f23ca53ec4f5911c51f8f1ea"}
Oct 11 10:40:09.844888 master-0 kubenswrapper[4790]: I1011 10:40:09.844213 4790 generic.go:334] "Generic (PLEG): container finished" podID="e3e6a069-f9e0-417c-9226-5ef929699b39" containerID="029043df943b5138a6cfb7a349576a0a53a8aa434f25983d2a5e8a39c878b25b" exitCode=0
Oct 11 10:40:09.844888 master-0 kubenswrapper[4790]: I1011 10:40:09.844239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerDied","Data":"029043df943b5138a6cfb7a349576a0a53a8aa434f25983d2a5e8a39c878b25b"}
Oct 11 10:40:10.043462 master-0 kubenswrapper[4790]: I1011 10:40:10.043369 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:40:10.043958 master-0 kubenswrapper[4790]: I1011 10:40:10.043914 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:40:10.054281 master-0 kubenswrapper[4790]: I1011 10:40:10.054204 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:40:10.076835 master-0 kubenswrapper[4790]: I1011 10:40:10.076764 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:40:10.076835 master-0 kubenswrapper[4790]: I1011 10:40:10.076832 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76f8bc4746-9rjdm"
Oct 11 10:40:10.079178 master-0 kubenswrapper[4790]: I1011 10:40:10.079105 4790 patch_prober.go:28] interesting pod/console-76f8bc4746-9rjdm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" start-of-body=
Oct 11 10:40:10.079247 master-0 kubenswrapper[4790]: I1011 10:40:10.079208 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76f8bc4746-9rjdm" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" probeResult="failure" output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused"
Oct 11 10:40:10.891231 master-0 kubenswrapper[4790]: I1011 10:40:10.891189 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:40:11.203901 master-0 kubenswrapper[4790]: I1011 10:40:11.203828 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Oct 11 10:40:11.887936 master-0 kubenswrapper[4790]: I1011 10:40:11.887874 4790 generic.go:334] "Generic (PLEG): container finished" podID="27006098-2092-43c6-97f8-0219e7fc4b81" containerID="f0ccf70a86f9257536be0c3f44809501e528a5a2cbeeb567e1c0b2ce3980447a" exitCode=0
Oct 11 10:40:11.887936 master-0 kubenswrapper[4790]: I1011 10:40:11.887940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerDied","Data":"f0ccf70a86f9257536be0c3f44809501e528a5a2cbeeb567e1c0b2ce3980447a"}
Oct 11 10:40:11.888229 master-0 kubenswrapper[4790]: I1011 10:40:11.887970 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerStarted","Data":"b4da09b1d91add0f61702efa94cfd5fadc0ea9628343d2e5c413b60af2167a29"}
Oct 11 10:40:11.892492 master-0 kubenswrapper[4790]: I1011 10:40:11.892412 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" event={"ID":"06012e2a-b507-48ad-9740-2c3cb3af5bdf","Type":"ContainerStarted","Data":"9a560a07bad2152cbf18de2d70b546b10a894a4c9eba4a43930ff340c8389571"}
Oct 11 10:40:11.892895 master-0 kubenswrapper[4790]: I1011 10:40:11.892512 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" event={"ID":"06012e2a-b507-48ad-9740-2c3cb3af5bdf","Type":"ContainerStarted","Data":"425a84423cf1ac9f26b84267e7f04d2060fcc86e77e7043fb9e75fb0bad2db16"}
Oct 11 10:40:11.897275 master-0 kubenswrapper[4790]: I1011 10:40:11.896548 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcc4t" event={"ID":"a5b695d5-a88c-4ff9-bc59-d13f61f237f6","Type":"ContainerStarted","Data":"b8b375c1d5ba37d0796c0f31dd781a4598802f156f66ac92adb0145d3738abca"}
Oct 11 10:40:11.899268 master-0 kubenswrapper[4790]: I1011 10:40:11.899249 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bn2sv" event={"ID":"0c2fae04-4b80-4621-ade0-9cd8fbdd0cd7","Type":"ContainerStarted","Data":"27d5862903c983bff49bd00da7e1422b50957d200b1478fe0e1e327797d15147"}
Oct 11 10:40:11.899494 master-0 kubenswrapper[4790]: I1011 10:40:11.899448 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-bn2sv"
Oct 11 10:40:11.902593 master-0 kubenswrapper[4790]: I1011 10:40:11.902535 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" event={"ID":"099ca022-6e9c-4604-b517-d90713dd6a44","Type":"ContainerStarted","Data":"dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef"}
Oct 11 10:40:11.978033 master-0 kubenswrapper[4790]: I1011 10:40:11.977950 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podStartSLOduration=138.783949883 podStartE2EDuration="2m28.977928445s" podCreationTimestamp="2025-10-11 10:37:43 +0000 UTC" firstStartedPulling="2025-10-11 10:40:00.582969411 +0000 UTC m=+77.137429703" lastFinishedPulling="2025-10-11 10:40:10.776947973 +0000 UTC m=+87.331408265" observedRunningTime="2025-10-11 10:40:11.97694902 +0000 UTC m=+88.531409312" watchObservedRunningTime="2025-10-11 10:40:11.977928445 +0000 UTC m=+88.532388737"
Oct 11 10:40:12.001940 master-0 kubenswrapper[4790]: I1011 10:40:12.001830 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zcc4t" podStartSLOduration=79.636162751 podStartE2EDuration="1m22.001800073s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:40:06.653339567 +0000 UTC m=+83.207799859" lastFinishedPulling="2025-10-11 10:40:09.018976869 +0000 UTC m=+85.573437181" observedRunningTime="2025-10-11 10:40:11.99741772 +0000 UTC m=+88.551878012" watchObservedRunningTime="2025-10-11 10:40:12.001800073 +0000 UTC m=+88.556260375"
Oct 11 10:40:12.022846 master-0 kubenswrapper[4790]: I1011 10:40:12.022471 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bn2sv" podStartSLOduration=77.91843361 podStartE2EDuration="1m22.022444379s" podCreationTimestamp="2025-10-11 10:38:50 +0000 UTC" firstStartedPulling="2025-10-11 10:40:06.654858416 +0000 UTC m=+83.209318708" lastFinishedPulling="2025-10-11 10:40:10.758869185 +0000 UTC m=+87.313329477" observedRunningTime="2025-10-11 10:40:12.020693443 +0000 UTC m=+88.575153775" watchObservedRunningTime="2025-10-11 10:40:12.022444379 +0000 UTC m=+88.576904671"
Oct 11 10:40:12.915449 master-0 kubenswrapper[4790]: I1011 10:40:12.914698 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerStarted","Data":"9cc67f3249e9ba1a87ad5f1fb5b5b937ef4a43c6916b2e1553d0f6d219913c74"}
Oct 11 10:40:12.915449 master-0 kubenswrapper[4790]: I1011 10:40:12.914790 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerStarted","Data":"1de1ae70cc5d6da8d5e33fb6fe9e8a4293266c4ac10f67debb82207ace9bc36c"}
Oct 11 10:40:12.915449 master-0 kubenswrapper[4790]: I1011 10:40:12.914835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerStarted","Data":"ae83c580c95e3b1f241ed240fae33cfe4a4481b7d9da0de952982e2241ebb676"}
Oct 11 10:40:12.915449 master-0 kubenswrapper[4790]: I1011 10:40:12.914852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerStarted","Data":"038e39a4b41e0edad0ccbd1de82e8d4c932dee16fdbd097c895c31138e8493fb"}
Oct 11 10:40:12.915449 master-0 kubenswrapper[4790]: I1011 10:40:12.914862 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerStarted","Data":"2011a0acce1f7e66c917cb1c7085c139facae2f3a7c84858a4429def26871e32"}
Oct 11 10:40:12.915449 master-0 kubenswrapper[4790]: I1011 10:40:12.914899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e3e6a069-f9e0-417c-9226-5ef929699b39","Type":"ContainerStarted","Data":"0001b56c3c2a8034aa4f57fc91cf32f256999750df54773f67e906cba6c30794"}
Oct 11 10:40:12.918125 master-0 kubenswrapper[4790]: I1011 10:40:12.918042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" event={"ID":"06012e2a-b507-48ad-9740-2c3cb3af5bdf","Type":"ContainerStarted","Data":"95cdac044629725c44de229a5f3d6aa6b550a67be81219fb460ea5981a0d112a"}
Oct 11 10:40:12.918125 master-0 kubenswrapper[4790]: I1011 10:40:12.918082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" event={"ID":"06012e2a-b507-48ad-9740-2c3cb3af5bdf","Type":"ContainerStarted","Data":"9fe8c2d9ab3ed0a51fdc0047414b82caa3d12fa18737aa831769005add687b8b"}
Oct 11 10:40:12.918125 master-0 kubenswrapper[4790]: I1011 10:40:12.918095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" event={"ID":"06012e2a-b507-48ad-9740-2c3cb3af5bdf","Type":"ContainerStarted","Data":"b21c9286cb6b0d3c9e3d6135adf5fafc9b8e6c733dbeb8e2935ebf6d3ce36a8f"}
Oct 11 10:40:12.918881 master-0 kubenswrapper[4790]: I1011 10:40:12.918793 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w"
Oct 11 10:40:12.964194 master-0 kubenswrapper[4790]: I1011 10:40:12.964108 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.683859549 podStartE2EDuration="9.964077306s" podCreationTimestamp="2025-10-11 10:40:03 +0000 UTC" firstStartedPulling="2025-10-11 10:40:06.654625701 +0000 UTC m=+83.209085993" lastFinishedPulling="2025-10-11 10:40:11.934843458 +0000 UTC m=+88.489303750" observedRunningTime="2025-10-11 10:40:12.962289699 +0000 UTC m=+89.516749991" watchObservedRunningTime="2025-10-11 10:40:12.964077306 +0000 UTC m=+89.518537608"
Oct 11 10:40:13.001738 master-0 kubenswrapper[4790]: I1011 10:40:13.001589 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w" podStartSLOduration=3.1955185249999998 podStartE2EDuration="9.001556347s" podCreationTimestamp="2025-10-11 10:40:04 +0000 UTC" firstStartedPulling="2025-10-11 10:40:06.65732196 +0000 UTC m=+83.211782282" lastFinishedPulling="2025-10-11 10:40:12.463359782 +0000 UTC m=+89.017820104" observedRunningTime="2025-10-11 10:40:12.999156235 +0000 UTC m=+89.553616567" watchObservedRunningTime="2025-10-11 10:40:13.001556347 +0000 UTC m=+89.556016679"
Oct 11 10:40:14.285906 master-0 kubenswrapper[4790]: I1011 10:40:14.285820 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/alertmanager-main-0"
Oct 11 10:40:15.059007 master-0 kubenswrapper[4790]: I1011 10:40:15.058936 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:40:15.059259 master-0 kubenswrapper[4790]: I1011 10:40:15.059125 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:40:15.070899 master-0 kubenswrapper[4790]: I1011 10:40:15.070841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:40:15.257740 master-0 kubenswrapper[4790]: I1011 10:40:15.257264 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f646dd4d8-qxd8w"
Oct 11 10:40:15.941586 master-0 kubenswrapper[4790]: I1011 10:40:15.941471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerStarted","Data":"34518034d2159a78d44e6347c922575227248da01713a7d4cc5aaae730daee11"}
Oct 11 10:40:15.941586 master-0 kubenswrapper[4790]: I1011 10:40:15.941567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerStarted","Data":"45a804de210a3a3a0c8d8956749a7ed5513d5ab5accb2e91de22b101c17bc054"}
Oct 11 10:40:15.941586 master-0 kubenswrapper[4790]: I1011 10:40:15.941591 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerStarted","Data":"fe7380bd6e0c3ce9adab3d9f7cab77bff97e9129678e0baa7b31a6fe54309115"}
Oct 11 10:40:15.942379 master-0 kubenswrapper[4790]: I1011 10:40:15.941610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerStarted","Data":"547c293f85953ae3ff30fa764c110e076ee5e86cd2a7023b3377cd343f88ed87"}
Oct 11 10:40:15.942379 master-0 kubenswrapper[4790]: I1011 10:40:15.941630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerStarted","Data":"b78e7077e53b29ae6c25df1f5e3f838054e54c94e7f01b73db52c5167d296283"}
Oct 11 10:40:15.942379 master-0 kubenswrapper[4790]: I1011 10:40:15.941649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27006098-2092-43c6-97f8-0219e7fc4b81","Type":"ContainerStarted","Data":"e0d793fe72abcb80701ddeacfac5bb5f45af601f249867c0837d3ff07394abc9"}
Oct 11 10:40:15.950782 master-0 kubenswrapper[4790]: I1011 10:40:15.950682 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:40:15.999313 master-0 kubenswrapper[4790]: I1011 10:40:15.999211 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.117112614 podStartE2EDuration="6.999173686s" podCreationTimestamp="2025-10-11 10:40:09 +0000 UTC" firstStartedPulling="2025-10-11 10:40:11.889234655 +0000 UTC m=+88.443694947" lastFinishedPulling="2025-10-11 10:40:14.771295727 +0000 UTC m=+91.325756019" observedRunningTime="2025-10-11 10:40:15.996968169 +0000 UTC m=+92.551428551" watchObservedRunningTime="2025-10-11 10:40:15.999173686 +0000 UTC m=+92.553633978"
Oct 11 10:40:19.669814 master-0 kubenswrapper[4790]: I1011 10:40:19.669655 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Oct 11 10:40:20.077945 master-0 kubenswrapper[4790]: I1011 10:40:20.077828 4790 patch_prober.go:28] interesting pod/console-76f8bc4746-9rjdm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" start-of-body=
Oct 11 10:40:20.078321 master-0 kubenswrapper[4790]: I1011 10:40:20.077943 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76f8bc4746-9rjdm" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" probeResult="failure" output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused"
Oct 11 10:40:29.825291 master-0 kubenswrapper[4790]: I1011 10:40:29.825194 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Oct 11 10:40:29.826460 master-0 kubenswrapper[4790]: I1011 10:40:29.826135 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:29.870335 master-0 kubenswrapper[4790]: I1011 10:40:29.870202 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Oct 11 10:40:30.004545 master-0 kubenswrapper[4790]: I1011 10:40:30.004437 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb4d3d6bdbb62903356b2987e206d2-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c6bb4d3d6bdbb62903356b2987e206d2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:30.004545 master-0 kubenswrapper[4790]: I1011 10:40:30.004547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb4d3d6bdbb62903356b2987e206d2-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c6bb4d3d6bdbb62903356b2987e206d2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:30.077542 master-0 kubenswrapper[4790]: I1011 10:40:30.077368 4790 patch_prober.go:28] interesting pod/console-76f8bc4746-9rjdm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" start-of-body=
Oct 11 10:40:30.077542 master-0 kubenswrapper[4790]: I1011 10:40:30.077475 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76f8bc4746-9rjdm" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" probeResult="failure" output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused"
Oct 11 10:40:30.106328 master-0 kubenswrapper[4790]: I1011 10:40:30.106256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb4d3d6bdbb62903356b2987e206d2-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c6bb4d3d6bdbb62903356b2987e206d2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:30.106491 master-0 kubenswrapper[4790]: I1011 10:40:30.106388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb4d3d6bdbb62903356b2987e206d2-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c6bb4d3d6bdbb62903356b2987e206d2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:30.106491 master-0 kubenswrapper[4790]: I1011 10:40:30.106421 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb4d3d6bdbb62903356b2987e206d2-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c6bb4d3d6bdbb62903356b2987e206d2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:30.106694 master-0 kubenswrapper[4790]: I1011 10:40:30.106602 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb4d3d6bdbb62903356b2987e206d2-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c6bb4d3d6bdbb62903356b2987e206d2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:30.168636 master-0 kubenswrapper[4790]: I1011 10:40:30.168540 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Oct 11 10:40:30.190660 master-0 kubenswrapper[4790]: W1011 10:40:30.190581 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6bb4d3d6bdbb62903356b2987e206d2.slice/crio-d11c0d73961fef96a17d590ac532686405f30611b8369d332f54216607db5de0 WatchSource:0}: Error finding container d11c0d73961fef96a17d590ac532686405f30611b8369d332f54216607db5de0: Status 404 returned error can't find the container with id d11c0d73961fef96a17d590ac532686405f30611b8369d332f54216607db5de0
Oct 11 10:40:31.022462 master-0 kubenswrapper[4790]: I1011 10:40:31.022395 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c6bb4d3d6bdbb62903356b2987e206d2","Type":"ContainerStarted","Data":"d11c0d73961fef96a17d590ac532686405f30611b8369d332f54216607db5de0"}
Oct 11 10:40:31.035384 master-0 kubenswrapper[4790]: I1011 10:40:31.032505 4790 generic.go:334] "Generic (PLEG): container finished" podID="6dd18b40-5213-44f7-83cd-99076fb3ee73" containerID="f810af855d03b458d1c2e2f8afd6d54238f19e74d825ff17da48e7f4eba7e4c6" exitCode=0
Oct 11 10:40:31.035384 master-0 kubenswrapper[4790]: I1011 10:40:31.032593 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"6dd18b40-5213-44f7-83cd-99076fb3ee73","Type":"ContainerDied","Data":"f810af855d03b458d1c2e2f8afd6d54238f19e74d825ff17da48e7f4eba7e4c6"}
Oct 11 10:40:32.358112 master-0 kubenswrapper[4790]: I1011 10:40:32.358053 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0"
Oct 11 10:40:32.534978 master-0 kubenswrapper[4790]: I1011 10:40:32.534859 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-var-lock\") pod \"6dd18b40-5213-44f7-83cd-99076fb3ee73\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") "
Oct 11 10:40:32.535936 master-0 kubenswrapper[4790]: I1011 10:40:32.535027 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6dd18b40-5213-44f7-83cd-99076fb3ee73-kube-api-access\") pod \"6dd18b40-5213-44f7-83cd-99076fb3ee73\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") "
Oct 11 10:40:32.535936 master-0 kubenswrapper[4790]: I1011 10:40:32.535089 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-kubelet-dir\") pod \"6dd18b40-5213-44f7-83cd-99076fb3ee73\" (UID: \"6dd18b40-5213-44f7-83cd-99076fb3ee73\") "
Oct 11 10:40:32.535936 master-0 kubenswrapper[4790]: I1011 10:40:32.535416 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6dd18b40-5213-44f7-83cd-99076fb3ee73" (UID: "6dd18b40-5213-44f7-83cd-99076fb3ee73"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:40:32.535936 master-0 kubenswrapper[4790]: I1011 10:40:32.535465 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-var-lock" (OuterVolumeSpecName: "var-lock") pod "6dd18b40-5213-44f7-83cd-99076fb3ee73" (UID: "6dd18b40-5213-44f7-83cd-99076fb3ee73"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:40:32.541868 master-0 kubenswrapper[4790]: I1011 10:40:32.541805 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd18b40-5213-44f7-83cd-99076fb3ee73-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6dd18b40-5213-44f7-83cd-99076fb3ee73" (UID: "6dd18b40-5213-44f7-83cd-99076fb3ee73"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:40:32.637621 master-0 kubenswrapper[4790]: I1011 10:40:32.637415 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6dd18b40-5213-44f7-83cd-99076fb3ee73-kube-api-access\") on node \"master-0\" DevicePath \"\""
Oct 11 10:40:32.637621 master-0 kubenswrapper[4790]: I1011 10:40:32.637500 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Oct 11 10:40:32.637621 master-0 kubenswrapper[4790]: I1011 10:40:32.637528 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6dd18b40-5213-44f7-83cd-99076fb3ee73-var-lock\") on node \"master-0\" DevicePath \"\""
Oct 11 10:40:33.048113 master-0 kubenswrapper[4790]: I1011 10:40:33.047954 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"6dd18b40-5213-44f7-83cd-99076fb3ee73","Type":"ContainerDied","Data":"65085622d36bce3903d45075fbfade9a38ec4d90dad3e9cbfb565e4e9d566b71"}
Oct 11 10:40:33.048113 master-0 kubenswrapper[4790]: I1011 10:40:33.048022 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65085622d36bce3903d45075fbfade9a38ec4d90dad3e9cbfb565e4e9d566b71"
Oct 11 10:40:33.048113 master-0 kubenswrapper[4790]: I1011 10:40:33.048113 4790
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Oct 11 10:40:34.328633 master-0 kubenswrapper[4790]: I1011 10:40:34.328575 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/alertmanager-main-0" Oct 11 10:40:37.826277 master-0 kubenswrapper[4790]: I1011 10:40:37.826196 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Oct 11 10:40:37.826964 master-0 kubenswrapper[4790]: E1011 10:40:37.826476 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd18b40-5213-44f7-83cd-99076fb3ee73" containerName="installer" Oct 11 10:40:37.826964 master-0 kubenswrapper[4790]: I1011 10:40:37.826498 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd18b40-5213-44f7-83cd-99076fb3ee73" containerName="installer" Oct 11 10:40:37.826964 master-0 kubenswrapper[4790]: I1011 10:40:37.826618 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd18b40-5213-44f7-83cd-99076fb3ee73" containerName="installer" Oct 11 10:40:37.828516 master-0 kubenswrapper[4790]: I1011 10:40:37.828476 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:37.888842 master-0 kubenswrapper[4790]: I1011 10:40:37.888663 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Oct 11 10:40:37.927595 master-0 kubenswrapper[4790]: I1011 10:40:37.927539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-usr-local-bin\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:37.927701 master-0 kubenswrapper[4790]: I1011 10:40:37.927605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-resource-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:37.927701 master-0 kubenswrapper[4790]: I1011 10:40:37.927637 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-cert-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:37.927701 master-0 kubenswrapper[4790]: I1011 10:40:37.927660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-static-pod-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:37.927701 master-0 kubenswrapper[4790]: I1011 10:40:37.927692 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-data-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:37.927844 master-0 kubenswrapper[4790]: I1011 10:40:37.927733 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-log-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029244 master-0 kubenswrapper[4790]: I1011 10:40:38.029173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-cert-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029244 master-0 kubenswrapper[4790]: I1011 10:40:38.029258 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-static-pod-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029351 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-data-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029390 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-log-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 
10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029394 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-cert-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-usr-local-bin\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-data-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029434 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-usr-local-bin\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029503 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-static-pod-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029596 master-0 kubenswrapper[4790]: I1011 10:40:38.029572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-resource-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029909 master-0 kubenswrapper[4790]: I1011 10:40:38.029604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-log-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.029909 master-0 kubenswrapper[4790]: I1011 10:40:38.029652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-resource-dir\") pod \"etcd-master-0\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:40:38.069576 master-0 kubenswrapper[4790]: I1011 10:40:38.069509 4790 generic.go:334] "Generic (PLEG): container finished" podID="a3934355-bb61-4316-b164-05294e12906a" containerID="7ad4b389f620d673e1c84d3f718fc34561da93dd445afd695e3bc1db0ae8b3cd" exitCode=0 Oct 11 10:40:38.069576 master-0 kubenswrapper[4790]: I1011 10:40:38.069576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-8-master-0" event={"ID":"a3934355-bb61-4316-b164-05294e12906a","Type":"ContainerDied","Data":"7ad4b389f620d673e1c84d3f718fc34561da93dd445afd695e3bc1db0ae8b3cd"} Oct 11 10:40:38.180836 master-0 kubenswrapper[4790]: I1011 10:40:38.180782 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:39.078610 master-0 kubenswrapper[4790]: I1011 10:40:39.078471 4790 generic.go:334] "Generic (PLEG): container finished" podID="c6bb4d3d6bdbb62903356b2987e206d2" containerID="289df993fad1f4fba2ca17fd7a3cf2133d080a63765dafc8aa2bcf6b7c69fc5b" exitCode=0 Oct 11 10:40:39.079544 master-0 kubenswrapper[4790]: I1011 10:40:39.078616 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c6bb4d3d6bdbb62903356b2987e206d2","Type":"ContainerDied","Data":"289df993fad1f4fba2ca17fd7a3cf2133d080a63765dafc8aa2bcf6b7c69fc5b"} Oct 11 10:40:39.082228 master-0 kubenswrapper[4790]: I1011 10:40:39.082160 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerStarted","Data":"3b919da8ddb7be0dba8b9a6a99bf3d4fe8ae3f53dd95f938e572564de35b4f48"} Oct 11 10:40:39.083812 master-0 kubenswrapper[4790]: I1011 10:40:39.083558 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0"] Oct 11 10:40:39.084250 master-0 kubenswrapper[4790]: I1011 10:40:39.084199 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" Oct 11 10:40:39.089399 master-0 kubenswrapper[4790]: I1011 10:40:39.089277 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Oct 11 10:40:39.089789 master-0 kubenswrapper[4790]: I1011 10:40:39.089746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"openshift-service-ca.crt" Oct 11 10:40:39.090034 master-0 kubenswrapper[4790]: I1011 10:40:39.090021 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"default-dockercfg-4hwjx" Oct 11 10:40:39.103691 master-0 kubenswrapper[4790]: I1011 10:40:39.103621 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0"] Oct 11 10:40:39.148252 master-0 kubenswrapper[4790]: I1011 10:40:39.148189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzddm\" (UniqueName: \"kubernetes.io/projected/95f1328a-5ab8-4276-9bfd-55b3dbb2a994-kube-api-access-tzddm\") pod \"openshift-kube-scheduler-guard-master-0\" (UID: \"95f1328a-5ab8-4276-9bfd-55b3dbb2a994\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" Oct 11 10:40:39.248801 master-0 kubenswrapper[4790]: I1011 10:40:39.248691 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzddm\" (UniqueName: \"kubernetes.io/projected/95f1328a-5ab8-4276-9bfd-55b3dbb2a994-kube-api-access-tzddm\") pod \"openshift-kube-scheduler-guard-master-0\" (UID: \"95f1328a-5ab8-4276-9bfd-55b3dbb2a994\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" Oct 11 10:40:39.273253 master-0 kubenswrapper[4790]: I1011 10:40:39.273213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzddm\" (UniqueName: 
\"kubernetes.io/projected/95f1328a-5ab8-4276-9bfd-55b3dbb2a994-kube-api-access-tzddm\") pod \"openshift-kube-scheduler-guard-master-0\" (UID: \"95f1328a-5ab8-4276-9bfd-55b3dbb2a994\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" Oct 11 10:40:39.388157 master-0 kubenswrapper[4790]: I1011 10:40:39.388116 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:40:39.416385 master-0 kubenswrapper[4790]: I1011 10:40:39.416334 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" Oct 11 10:40:39.452747 master-0 kubenswrapper[4790]: I1011 10:40:39.452636 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-kubelet-dir\") pod \"a3934355-bb61-4316-b164-05294e12906a\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " Oct 11 10:40:39.452859 master-0 kubenswrapper[4790]: I1011 10:40:39.452795 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-var-lock\") pod \"a3934355-bb61-4316-b164-05294e12906a\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " Oct 11 10:40:39.452911 master-0 kubenswrapper[4790]: I1011 10:40:39.452895 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") pod \"a3934355-bb61-4316-b164-05294e12906a\" (UID: \"a3934355-bb61-4316-b164-05294e12906a\") " Oct 11 10:40:39.453065 master-0 kubenswrapper[4790]: I1011 10:40:39.453009 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "a3934355-bb61-4316-b164-05294e12906a" (UID: "a3934355-bb61-4316-b164-05294e12906a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:40:39.453115 master-0 kubenswrapper[4790]: I1011 10:40:39.453097 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-var-lock" (OuterVolumeSpecName: "var-lock") pod "a3934355-bb61-4316-b164-05294e12906a" (UID: "a3934355-bb61-4316-b164-05294e12906a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:40:39.453243 master-0 kubenswrapper[4790]: I1011 10:40:39.453195 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:40:39.453295 master-0 kubenswrapper[4790]: I1011 10:40:39.453244 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3934355-bb61-4316-b164-05294e12906a-var-lock\") on node \"master-0\" DevicePath \"\"" Oct 11 10:40:39.457585 master-0 kubenswrapper[4790]: I1011 10:40:39.457558 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a3934355-bb61-4316-b164-05294e12906a" (UID: "a3934355-bb61-4316-b164-05294e12906a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:40:39.554173 master-0 kubenswrapper[4790]: I1011 10:40:39.554134 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3934355-bb61-4316-b164-05294e12906a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:40:39.825674 master-0 kubenswrapper[4790]: I1011 10:40:39.825606 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0"] Oct 11 10:40:39.831087 master-0 kubenswrapper[4790]: W1011 10:40:39.831042 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f1328a_5ab8_4276_9bfd_55b3dbb2a994.slice/crio-c55f42cfe8653fa8706646db8a6223894e0bf53deb5fa86dfa59ef2811cbde3d WatchSource:0}: Error finding container c55f42cfe8653fa8706646db8a6223894e0bf53deb5fa86dfa59ef2811cbde3d: Status 404 returned error can't find the container with id c55f42cfe8653fa8706646db8a6223894e0bf53deb5fa86dfa59ef2811cbde3d Oct 11 10:40:40.076952 master-0 kubenswrapper[4790]: I1011 10:40:40.076882 4790 patch_prober.go:28] interesting pod/console-76f8bc4746-9rjdm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" start-of-body= Oct 11 10:40:40.077195 master-0 kubenswrapper[4790]: I1011 10:40:40.076962 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76f8bc4746-9rjdm" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" probeResult="failure" output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" Oct 11 10:40:40.088539 master-0 kubenswrapper[4790]: I1011 10:40:40.088459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-8-master-0" 
event={"ID":"a3934355-bb61-4316-b164-05294e12906a","Type":"ContainerDied","Data":"4a1e893e4dcbac435d2da0e68c078d559bd07b9d3b598e66849b7e045c462376"} Oct 11 10:40:40.089090 master-0 kubenswrapper[4790]: I1011 10:40:40.088566 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1e893e4dcbac435d2da0e68c078d559bd07b9d3b598e66849b7e045c462376" Oct 11 10:40:40.089090 master-0 kubenswrapper[4790]: I1011 10:40:40.088581 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-8-master-0" Oct 11 10:40:40.090279 master-0 kubenswrapper[4790]: I1011 10:40:40.090217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" event={"ID":"95f1328a-5ab8-4276-9bfd-55b3dbb2a994","Type":"ContainerStarted","Data":"0ce8427bbe6d75f182a5b32d8cd2f22fe40fc718730d32921af86371a9379011"} Oct 11 10:40:40.090279 master-0 kubenswrapper[4790]: I1011 10:40:40.090279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" event={"ID":"95f1328a-5ab8-4276-9bfd-55b3dbb2a994","Type":"ContainerStarted","Data":"c55f42cfe8653fa8706646db8a6223894e0bf53deb5fa86dfa59ef2811cbde3d"} Oct 11 10:40:40.090863 master-0 kubenswrapper[4790]: I1011 10:40:40.090813 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" Oct 11 10:40:40.093444 master-0 kubenswrapper[4790]: I1011 10:40:40.093381 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c6bb4d3d6bdbb62903356b2987e206d2","Type":"ContainerStarted","Data":"e036684c78a5021fb48c289d54cae789fa1bf5823ee42d9f136d64fecac494b3"} Oct 11 10:40:40.093444 master-0 kubenswrapper[4790]: I1011 10:40:40.093446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c6bb4d3d6bdbb62903356b2987e206d2","Type":"ContainerStarted","Data":"3bafb2f92a85e23ceb69467f8cda564274c1f8c23cdc1567f9df59d0d32f0e95"} Oct 11 10:40:40.093562 master-0 kubenswrapper[4790]: I1011 10:40:40.093460 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c6bb4d3d6bdbb62903356b2987e206d2","Type":"ContainerStarted","Data":"5ae70bae04836aa8a45f8cf884baea6ccb012cd945a396865da15f8d293dd984"} Oct 11 10:40:40.093609 master-0 kubenswrapper[4790]: I1011 10:40:40.093580 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Oct 11 10:40:40.096908 master-0 kubenswrapper[4790]: I1011 10:40:40.096869 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" Oct 11 10:40:40.106716 master-0 kubenswrapper[4790]: I1011 10:40:40.106284 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7d46fcc5c6-n88q4" Oct 11 10:40:40.116457 master-0 kubenswrapper[4790]: I1011 10:40:40.116380 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0" podStartSLOduration=1.116359718 podStartE2EDuration="1.116359718s" podCreationTimestamp="2025-10-11 10:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:40:40.114737327 +0000 UTC m=+116.669197629" watchObservedRunningTime="2025-10-11 10:40:40.116359718 +0000 UTC m=+116.670820010" Oct 11 10:40:40.177538 master-0 kubenswrapper[4790]: I1011 10:40:40.177431 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=11.177392941 podStartE2EDuration="11.177392941s" podCreationTimestamp="2025-10-11 10:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:40:40.172865854 +0000 UTC m=+116.727326156" watchObservedRunningTime="2025-10-11 10:40:40.177392941 +0000 UTC m=+116.731853233" Oct 11 10:40:42.107171 master-0 kubenswrapper[4790]: I1011 10:40:42.107062 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796" exitCode=0 Oct 11 10:40:42.108266 master-0 kubenswrapper[4790]: I1011 10:40:42.107502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerDied","Data":"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796"} Oct 11 10:40:43.114938 master-0 kubenswrapper[4790]: I1011 10:40:43.114840 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f" exitCode=0 Oct 11 10:40:43.114938 master-0 kubenswrapper[4790]: I1011 10:40:43.114915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerDied","Data":"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f"} Oct 11 10:40:44.126423 master-0 kubenswrapper[4790]: I1011 10:40:44.126351 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0" exitCode=0 Oct 11 10:40:44.127635 master-0 kubenswrapper[4790]: I1011 10:40:44.126475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerDied","Data":"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0"} Oct 11 10:40:44.819361 master-0 kubenswrapper[4790]: I1011 10:40:44.819244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bn2sv" Oct 11 10:40:45.136408 master-0 kubenswrapper[4790]: I1011 10:40:45.136338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerStarted","Data":"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970"} Oct 11 10:40:46.149331 master-0 kubenswrapper[4790]: I1011 10:40:46.149273 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/0.log" Oct 11 10:40:46.151549 master-0 kubenswrapper[4790]: I1011 10:40:46.151483 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="35dda1dfed46ef2979f5c931d6cb1625edfbd4b3165e5e094d4299e095ab99b2" exitCode=1 Oct 11 10:40:46.151666 master-0 kubenswrapper[4790]: I1011 10:40:46.151545 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerStarted","Data":"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef"} Oct 11 10:40:46.151666 master-0 kubenswrapper[4790]: I1011 10:40:46.151584 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerStarted","Data":"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377"} Oct 11 10:40:46.151666 master-0 kubenswrapper[4790]: I1011 10:40:46.151600 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerDied","Data":"35dda1dfed46ef2979f5c931d6cb1625edfbd4b3165e5e094d4299e095ab99b2"} Oct 11 10:40:46.481909 master-0 kubenswrapper[4790]: I1011 10:40:46.481801 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-0"] Oct 11 10:40:47.176599 master-0 kubenswrapper[4790]: I1011 10:40:47.176456 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/0.log" Oct 11 10:40:47.178817 master-0 kubenswrapper[4790]: I1011 10:40:47.178691 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerStarted","Data":"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060"} Oct 11 10:40:47.179751 master-0 kubenswrapper[4790]: I1011 10:40:47.179661 4790 scope.go:117] "RemoveContainer" containerID="35dda1dfed46ef2979f5c931d6cb1625edfbd4b3165e5e094d4299e095ab99b2" Oct 11 10:40:48.181136 master-0 kubenswrapper[4790]: I1011 10:40:48.181022 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:48.181136 master-0 kubenswrapper[4790]: I1011 10:40:48.181094 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:48.181136 master-0 kubenswrapper[4790]: I1011 10:40:48.181112 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:48.181136 master-0 kubenswrapper[4790]: I1011 10:40:48.181133 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:48.188884 master-0 kubenswrapper[4790]: I1011 10:40:48.188828 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/1.log" Oct 11 10:40:48.191140 master-0 kubenswrapper[4790]: I1011 10:40:48.191084 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/0.log" Oct 11 10:40:48.193335 master-0 kubenswrapper[4790]: I1011 10:40:48.193274 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" exitCode=1 Oct 11 10:40:48.193335 master-0 kubenswrapper[4790]: I1011 10:40:48.193313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerDied","Data":"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227"} Oct 11 10:40:48.193553 master-0 kubenswrapper[4790]: I1011 10:40:48.193372 4790 scope.go:117] "RemoveContainer" containerID="35dda1dfed46ef2979f5c931d6cb1625edfbd4b3165e5e094d4299e095ab99b2" Oct 11 10:40:48.194335 master-0 kubenswrapper[4790]: I1011 10:40:48.194252 4790 scope.go:117] "RemoveContainer" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:40:48.194783 master-0 kubenswrapper[4790]: E1011 10:40:48.194744 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd pod=etcd-master-0_openshift-etcd(a7e53a8977ce5fc5588aef94f91dcc24)\"" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" Oct 11 10:40:49.203006 master-0 kubenswrapper[4790]: I1011 10:40:49.202919 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/1.log" Oct 11 10:40:49.207380 master-0 kubenswrapper[4790]: I1011 10:40:49.207310 4790 scope.go:117] "RemoveContainer" 
containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:40:49.207653 master-0 kubenswrapper[4790]: E1011 10:40:49.207601 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd pod=etcd-master-0_openshift-etcd(a7e53a8977ce5fc5588aef94f91dcc24)\"" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" Oct 11 10:40:50.078266 master-0 kubenswrapper[4790]: I1011 10:40:50.078186 4790 patch_prober.go:28] interesting pod/console-76f8bc4746-9rjdm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" start-of-body= Oct 11 10:40:50.078593 master-0 kubenswrapper[4790]: I1011 10:40:50.078286 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76f8bc4746-9rjdm" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" probeResult="failure" output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" Oct 11 10:40:51.511049 master-0 kubenswrapper[4790]: I1011 10:40:51.510946 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-guard-master-0"] Oct 11 10:40:51.511975 master-0 kubenswrapper[4790]: E1011 10:40:51.511187 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3934355-bb61-4316-b164-05294e12906a" containerName="installer" Oct 11 10:40:51.511975 master-0 kubenswrapper[4790]: I1011 10:40:51.511209 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3934355-bb61-4316-b164-05294e12906a" containerName="installer" Oct 11 10:40:51.511975 master-0 kubenswrapper[4790]: I1011 10:40:51.511310 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3934355-bb61-4316-b164-05294e12906a" containerName="installer" Oct 11 10:40:51.511975 
master-0 kubenswrapper[4790]: I1011 10:40:51.511887 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:40:51.515757 master-0 kubenswrapper[4790]: I1011 10:40:51.515629 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"default-dockercfg-rkxgf" Oct 11 10:40:51.515945 master-0 kubenswrapper[4790]: I1011 10:40:51.515853 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"openshift-service-ca.crt" Oct 11 10:40:51.516057 master-0 kubenswrapper[4790]: I1011 10:40:51.515914 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Oct 11 10:40:51.528865 master-0 kubenswrapper[4790]: I1011 10:40:51.528800 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/etcd-guard-master-0"] Oct 11 10:40:51.612526 master-0 kubenswrapper[4790]: I1011 10:40:51.612431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64gl\" (UniqueName: \"kubernetes.io/projected/c6436766-e7b0-471b-acbf-861280191521-kube-api-access-q64gl\") pod \"etcd-guard-master-0\" (UID: \"c6436766-e7b0-471b-acbf-861280191521\") " pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:40:51.714305 master-0 kubenswrapper[4790]: I1011 10:40:51.714156 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64gl\" (UniqueName: \"kubernetes.io/projected/c6436766-e7b0-471b-acbf-861280191521-kube-api-access-q64gl\") pod \"etcd-guard-master-0\" (UID: \"c6436766-e7b0-471b-acbf-861280191521\") " pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:40:51.736002 master-0 kubenswrapper[4790]: I1011 10:40:51.735886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64gl\" (UniqueName: \"kubernetes.io/projected/c6436766-e7b0-471b-acbf-861280191521-kube-api-access-q64gl\") pod 
\"etcd-guard-master-0\" (UID: \"c6436766-e7b0-471b-acbf-861280191521\") " pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:40:51.831989 master-0 kubenswrapper[4790]: I1011 10:40:51.831886 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:40:52.396040 master-0 kubenswrapper[4790]: I1011 10:40:52.395941 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/etcd-guard-master-0"] Oct 11 10:40:52.399752 master-0 kubenswrapper[4790]: W1011 10:40:52.399633 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6436766_e7b0_471b_acbf_861280191521.slice/crio-51c414a844b478d418e57b3713e6576b51b14b364314441dcee769fc9449f46a WatchSource:0}: Error finding container 51c414a844b478d418e57b3713e6576b51b14b364314441dcee769fc9449f46a: Status 404 returned error can't find the container with id 51c414a844b478d418e57b3713e6576b51b14b364314441dcee769fc9449f46a Oct 11 10:40:53.181252 master-0 kubenswrapper[4790]: I1011 10:40:53.181127 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:53.181925 master-0 kubenswrapper[4790]: I1011 10:40:53.181265 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:53.182677 master-0 kubenswrapper[4790]: I1011 10:40:53.182628 4790 scope.go:117] "RemoveContainer" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:40:53.183254 master-0 kubenswrapper[4790]: E1011 10:40:53.183192 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd pod=etcd-master-0_openshift-etcd(a7e53a8977ce5fc5588aef94f91dcc24)\"" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" Oct 11 
10:40:53.232877 master-0 kubenswrapper[4790]: I1011 10:40:53.232770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-guard-master-0" event={"ID":"c6436766-e7b0-471b-acbf-861280191521","Type":"ContainerStarted","Data":"9e98490fd3666e68c05ba349bef300928b07e9009d6f846b655d69140196a8a3"} Oct 11 10:40:53.232877 master-0 kubenswrapper[4790]: I1011 10:40:53.232882 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-guard-master-0" event={"ID":"c6436766-e7b0-471b-acbf-861280191521","Type":"ContainerStarted","Data":"51c414a844b478d418e57b3713e6576b51b14b364314441dcee769fc9449f46a"} Oct 11 10:40:53.233129 master-0 kubenswrapper[4790]: I1011 10:40:53.232956 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:40:53.285097 master-0 kubenswrapper[4790]: I1011 10:40:53.284977 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-guard-master-0" podStartSLOduration=2.28495318 podStartE2EDuration="2.28495318s" podCreationTimestamp="2025-10-11 10:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:40:53.281702526 +0000 UTC m=+129.836162818" watchObservedRunningTime="2025-10-11 10:40:53.28495318 +0000 UTC m=+129.839413492" Oct 11 10:40:58.182275 master-0 kubenswrapper[4790]: I1011 10:40:58.182132 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:58.182275 master-0 kubenswrapper[4790]: I1011 10:40:58.182255 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Oct 11 10:40:58.183600 master-0 kubenswrapper[4790]: I1011 10:40:58.183538 4790 scope.go:117] "RemoveContainer" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:40:58.233676 master-0 
kubenswrapper[4790]: I1011 10:40:58.233579 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:40:58.233944 master-0 kubenswrapper[4790]: I1011 10:40:58.233700 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:40:59.271872 master-0 kubenswrapper[4790]: I1011 10:40:59.271785 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/1.log" Oct 11 10:40:59.275374 master-0 kubenswrapper[4790]: I1011 10:40:59.275309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"a7e53a8977ce5fc5588aef94f91dcc24","Type":"ContainerStarted","Data":"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca"} Oct 11 10:40:59.324595 master-0 kubenswrapper[4790]: I1011 10:40:59.324469 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=22.324447468 podStartE2EDuration="22.324447468s" podCreationTimestamp="2025-10-11 10:40:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:40:59.320788004 +0000 UTC m=+135.875248336" watchObservedRunningTime="2025-10-11 10:40:59.324447468 +0000 UTC m=+135.878907770" Oct 11 10:41:00.079960 master-0 kubenswrapper[4790]: I1011 10:41:00.079818 4790 patch_prober.go:28] interesting pod/console-76f8bc4746-9rjdm container/console namespace/openshift-console: 
Startup probe status=failure output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" start-of-body= Oct 11 10:41:00.079960 master-0 kubenswrapper[4790]: I1011 10:41:00.079938 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-76f8bc4746-9rjdm" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" probeResult="failure" output="Get \"https://10.130.0.14:8443/health\": dial tcp 10.130.0.14:8443: connect: connection refused" Oct 11 10:41:00.689878 master-0 kubenswrapper[4790]: I1011 10:41:00.689773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/etcd-guard-master-0"] Oct 11 10:41:00.690881 master-0 kubenswrapper[4790]: I1011 10:41:00.689990 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 10:41:03.181127 master-0 kubenswrapper[4790]: I1011 10:41:03.180990 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Oct 11 10:41:03.234344 master-0 kubenswrapper[4790]: I1011 10:41:03.234250 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:03.234344 master-0 kubenswrapper[4790]: I1011 10:41:03.234325 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:08.181960 master-0 kubenswrapper[4790]: I1011 10:41:08.181629 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Oct 11 10:41:08.234738 master-0 
kubenswrapper[4790]: I1011 10:41:08.234530 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:08.234738 master-0 kubenswrapper[4790]: I1011 10:41:08.234724 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:09.182143 master-0 kubenswrapper[4790]: I1011 10:41:09.182035 4790 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:09.183257 master-0 kubenswrapper[4790]: I1011 10:41:09.182158 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:09.669874 master-0 kubenswrapper[4790]: I1011 10:41:09.669806 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:41:09.714494 master-0 kubenswrapper[4790]: I1011 10:41:09.714447 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:41:10.082942 master-0 kubenswrapper[4790]: I1011 10:41:10.082879 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:41:10.087892 master-0 kubenswrapper[4790]: I1011 10:41:10.087841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:41:10.364201 master-0 kubenswrapper[4790]: I1011 10:41:10.364054 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Oct 11 10:41:13.236672 master-0 kubenswrapper[4790]: I1011 10:41:13.235983 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:13.236672 master-0 kubenswrapper[4790]: I1011 10:41:13.236112 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:18.236951 master-0 kubenswrapper[4790]: I1011 10:41:18.236827 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:18.237605 master-0 kubenswrapper[4790]: I1011 10:41:18.237001 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:19.182781 master-0 kubenswrapper[4790]: I1011 
10:41:19.182577 4790 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:19.182781 master-0 kubenswrapper[4790]: I1011 10:41:19.182772 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:23.238668 master-0 kubenswrapper[4790]: I1011 10:41:23.238542 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:23.238668 master-0 kubenswrapper[4790]: I1011 10:41:23.238639 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:28.239779 master-0 kubenswrapper[4790]: I1011 10:41:28.239627 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:28.240975 master-0 kubenswrapper[4790]: I1011 10:41:28.239795 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" 
probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:29.182931 master-0 kubenswrapper[4790]: I1011 10:41:29.182838 4790 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:41:29.183466 master-0 kubenswrapper[4790]: I1011 10:41:29.182978 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:41:29.799109 master-0 kubenswrapper[4790]: I1011 10:41:29.799021 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:41:30.181444 master-0 kubenswrapper[4790]: I1011 10:41:30.181206 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Oct 11 10:41:38.195976 master-0 kubenswrapper[4790]: I1011 10:41:38.195905 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Oct 11 10:41:38.209574 master-0 kubenswrapper[4790]: I1011 10:41:38.209524 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Oct 11 10:41:46.394085 master-0 kubenswrapper[4790]: I1011 10:41:46.393968 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76f8bc4746-9rjdm"] Oct 11 10:42:04.487369 master-0 kubenswrapper[4790]: I1011 10:42:04.487256 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/installer-6-master-0"] Oct 11 10:42:04.488386 master-0 kubenswrapper[4790]: I1011 10:42:04.488220 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.491733 master-0 kubenswrapper[4790]: I1011 10:42:04.491616 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-djvlq" Oct 11 10:42:04.492703 master-0 kubenswrapper[4790]: I1011 10:42:04.492595 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 11 10:42:04.505555 master-0 kubenswrapper[4790]: I1011 10:42:04.505492 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-master-0"] Oct 11 10:42:04.678490 master-0 kubenswrapper[4790]: I1011 10:42:04.678392 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.678490 master-0 kubenswrapper[4790]: I1011 10:42:04.678508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2957c2-bc3c-4399-b508-37a1a7689108-kube-api-access\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.678917 master-0 kubenswrapper[4790]: I1011 10:42:04.678548 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-var-lock\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.779628 master-0 kubenswrapper[4790]: I1011 10:42:04.779408 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2957c2-bc3c-4399-b508-37a1a7689108-kube-api-access\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.779628 master-0 kubenswrapper[4790]: I1011 10:42:04.779482 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-var-lock\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.779628 master-0 kubenswrapper[4790]: I1011 10:42:04.779540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.779628 master-0 kubenswrapper[4790]: I1011 10:42:04.779630 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.780289 master-0 kubenswrapper[4790]: I1011 10:42:04.779973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-var-lock\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.804239 master-0 kubenswrapper[4790]: I1011 10:42:04.804148 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2957c2-bc3c-4399-b508-37a1a7689108-kube-api-access\") pod \"installer-6-master-0\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:04.815258 master-0 kubenswrapper[4790]: I1011 10:42:04.815189 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:05.333850 master-0 kubenswrapper[4790]: W1011 10:42:05.333754 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3d2957c2_bc3c_4399_b508_37a1a7689108.slice/crio-93e88f1fafa68d434978386be2c8ac316fefc0276f3cdc7d47fedf2f6e452356 WatchSource:0}: Error finding container 93e88f1fafa68d434978386be2c8ac316fefc0276f3cdc7d47fedf2f6e452356: Status 404 returned error can't find the container with id 93e88f1fafa68d434978386be2c8ac316fefc0276f3cdc7d47fedf2f6e452356 Oct 11 10:42:05.406937 master-0 kubenswrapper[4790]: I1011 10:42:05.406828 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-master-0"] Oct 11 10:42:05.634596 master-0 kubenswrapper[4790]: I1011 10:42:05.634439 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" event={"ID":"3d2957c2-bc3c-4399-b508-37a1a7689108","Type":"ContainerStarted","Data":"93e88f1fafa68d434978386be2c8ac316fefc0276f3cdc7d47fedf2f6e452356"} Oct 11 10:42:08.320577 master-0 kubenswrapper[4790]: I1011 10:42:08.320324 4790 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Oct 11 10:42:08.321306 master-0 kubenswrapper[4790]: I1011 10:42:08.321133 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.324297 master-0 kubenswrapper[4790]: I1011 10:42:08.324120 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-t28rg" Oct 11 10:42:08.324482 master-0 kubenswrapper[4790]: I1011 10:42:08.324454 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 11 10:42:08.332140 master-0 kubenswrapper[4790]: I1011 10:42:08.332087 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Oct 11 10:42:08.441498 master-0 kubenswrapper[4790]: I1011 10:42:08.441463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-var-lock\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.441694 master-0 kubenswrapper[4790]: I1011 10:42:08.441565 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b44d0e-0afa-47db-a215-114b99006a12-kube-api-access\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.441788 master-0 kubenswrapper[4790]: I1011 10:42:08.441743 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-kubelet-dir\") pod \"installer-5-master-0\" (UID: 
\"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.543295 master-0 kubenswrapper[4790]: I1011 10:42:08.543080 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.543295 master-0 kubenswrapper[4790]: I1011 10:42:08.543269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-var-lock\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.543846 master-0 kubenswrapper[4790]: I1011 10:42:08.543314 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-var-lock\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.543846 master-0 kubenswrapper[4790]: I1011 10:42:08.543349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b44d0e-0afa-47db-a215-114b99006a12-kube-api-access\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.543846 master-0 kubenswrapper[4790]: I1011 10:42:08.543266 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " 
pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.567800 master-0 kubenswrapper[4790]: I1011 10:42:08.566675 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b44d0e-0afa-47db-a215-114b99006a12-kube-api-access\") pod \"installer-5-master-0\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.661186 master-0 kubenswrapper[4790]: I1011 10:42:08.660973 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" event={"ID":"3d2957c2-bc3c-4399-b508-37a1a7689108","Type":"ContainerStarted","Data":"2d6f8b55cb9d1ca99486dd5352b34512a263c62c8f4e3533fc79b7394d55c8e0"} Oct 11 10:42:08.691333 master-0 kubenswrapper[4790]: I1011 10:42:08.691196 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:08.691570 master-0 kubenswrapper[4790]: I1011 10:42:08.691450 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-6-master-0" podStartSLOduration=1.9387448539999999 podStartE2EDuration="4.691422536s" podCreationTimestamp="2025-10-11 10:42:04 +0000 UTC" firstStartedPulling="2025-10-11 10:42:05.337185532 +0000 UTC m=+201.891645854" lastFinishedPulling="2025-10-11 10:42:08.089863244 +0000 UTC m=+204.644323536" observedRunningTime="2025-10-11 10:42:08.688511588 +0000 UTC m=+205.242971910" watchObservedRunningTime="2025-10-11 10:42:08.691422536 +0000 UTC m=+205.245882838" Oct 11 10:42:09.289080 master-0 kubenswrapper[4790]: W1011 10:42:09.288983 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb5b44d0e_0afa_47db_a215_114b99006a12.slice/crio-b2f3fc67df160d3adfc50304aaf54cdfff608931c2d0a54d0ec7908bd3846999 WatchSource:0}: Error finding container 
b2f3fc67df160d3adfc50304aaf54cdfff608931c2d0a54d0ec7908bd3846999: Status 404 returned error can't find the container with id b2f3fc67df160d3adfc50304aaf54cdfff608931c2d0a54d0ec7908bd3846999 Oct 11 10:42:09.309792 master-0 kubenswrapper[4790]: I1011 10:42:09.309681 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Oct 11 10:42:09.668144 master-0 kubenswrapper[4790]: I1011 10:42:09.668015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"b5b44d0e-0afa-47db-a215-114b99006a12","Type":"ContainerStarted","Data":"4be921d32cc962b169c97845895b27ff57f7dd21aaaefe6e16cf013e922ab5ec"} Oct 11 10:42:09.668144 master-0 kubenswrapper[4790]: I1011 10:42:09.668086 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"b5b44d0e-0afa-47db-a215-114b99006a12","Type":"ContainerStarted","Data":"b2f3fc67df160d3adfc50304aaf54cdfff608931c2d0a54d0ec7908bd3846999"} Oct 11 10:42:09.719554 master-0 kubenswrapper[4790]: I1011 10:42:09.719457 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=1.7194096509999999 podStartE2EDuration="1.719409651s" podCreationTimestamp="2025-10-11 10:42:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:42:09.716325778 +0000 UTC m=+206.270786070" watchObservedRunningTime="2025-10-11 10:42:09.719409651 +0000 UTC m=+206.273869963" Oct 11 10:42:11.434990 master-0 kubenswrapper[4790]: I1011 10:42:11.434868 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-76f8bc4746-9rjdm" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" containerID="cri-o://199044355c2ca69bb6ac6dacb80814ae2bdf5a58183730d9a6202cd23bded675" 
gracePeriod=15 Oct 11 10:42:11.680095 master-0 kubenswrapper[4790]: I1011 10:42:11.679999 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76f8bc4746-9rjdm_ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db/console/0.log" Oct 11 10:42:11.680346 master-0 kubenswrapper[4790]: I1011 10:42:11.680099 4790 generic.go:334] "Generic (PLEG): container finished" podID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerID="199044355c2ca69bb6ac6dacb80814ae2bdf5a58183730d9a6202cd23bded675" exitCode=2 Oct 11 10:42:11.680346 master-0 kubenswrapper[4790]: I1011 10:42:11.680155 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f8bc4746-9rjdm" event={"ID":"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db","Type":"ContainerDied","Data":"199044355c2ca69bb6ac6dacb80814ae2bdf5a58183730d9a6202cd23bded675"} Oct 11 10:42:11.908871 master-0 kubenswrapper[4790]: I1011 10:42:11.908787 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76f8bc4746-9rjdm_ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db/console/0.log" Oct 11 10:42:11.908989 master-0 kubenswrapper[4790]: I1011 10:42:11.908921 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:42:12.087364 master-0 kubenswrapper[4790]: I1011 10:42:12.087223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-service-ca\") pod \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " Oct 11 10:42:12.087364 master-0 kubenswrapper[4790]: I1011 10:42:12.087316 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-trusted-ca-bundle\") pod \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " Oct 11 10:42:12.087364 master-0 kubenswrapper[4790]: I1011 10:42:12.087354 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ljnp\" (UniqueName: \"kubernetes.io/projected/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-kube-api-access-7ljnp\") pod \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " Oct 11 10:42:12.087364 master-0 kubenswrapper[4790]: I1011 10:42:12.087386 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-serving-cert\") pod \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " Oct 11 10:42:12.088048 master-0 kubenswrapper[4790]: I1011 10:42:12.087409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-oauth-config\") pod \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " Oct 11 10:42:12.088048 master-0 kubenswrapper[4790]: I1011 
10:42:12.087450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-oauth-serving-cert\") pod \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " Oct 11 10:42:12.088048 master-0 kubenswrapper[4790]: I1011 10:42:12.087523 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-config\") pod \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\" (UID: \"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db\") " Oct 11 10:42:12.088048 master-0 kubenswrapper[4790]: I1011 10:42:12.087960 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-service-ca" (OuterVolumeSpecName: "service-ca") pod "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" (UID: "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:42:12.088786 master-0 kubenswrapper[4790]: I1011 10:42:12.088680 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-config" (OuterVolumeSpecName: "console-config") pod "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" (UID: "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:42:12.088928 master-0 kubenswrapper[4790]: I1011 10:42:12.088770 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" (UID: "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:42:12.089013 master-0 kubenswrapper[4790]: I1011 10:42:12.088903 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" (UID: "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:42:12.092178 master-0 kubenswrapper[4790]: I1011 10:42:12.092102 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" (UID: "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:42:12.094291 master-0 kubenswrapper[4790]: I1011 10:42:12.094230 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-kube-api-access-7ljnp" (OuterVolumeSpecName: "kube-api-access-7ljnp") pod "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" (UID: "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db"). InnerVolumeSpecName "kube-api-access-7ljnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:42:12.094291 master-0 kubenswrapper[4790]: I1011 10:42:12.094219 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" (UID: "ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:42:12.188818 master-0 kubenswrapper[4790]: I1011 10:42:12.188620 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:12.188818 master-0 kubenswrapper[4790]: I1011 10:42:12.188756 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-service-ca\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:12.188818 master-0 kubenswrapper[4790]: I1011 10:42:12.188787 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:12.188818 master-0 kubenswrapper[4790]: I1011 10:42:12.188820 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ljnp\" (UniqueName: \"kubernetes.io/projected/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-kube-api-access-7ljnp\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:12.189200 master-0 kubenswrapper[4790]: I1011 10:42:12.188846 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:12.189200 master-0 kubenswrapper[4790]: I1011 10:42:12.188866 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:12.189200 master-0 kubenswrapper[4790]: I1011 10:42:12.188884 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:12.689277 master-0 kubenswrapper[4790]: I1011 10:42:12.689197 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76f8bc4746-9rjdm_ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db/console/0.log" Oct 11 10:42:12.689277 master-0 kubenswrapper[4790]: I1011 10:42:12.689266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f8bc4746-9rjdm" event={"ID":"ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db","Type":"ContainerDied","Data":"f61d3e1404aecae93207fe8056d63854d603a520f23437c58f022239f1436a77"} Oct 11 10:42:12.690307 master-0 kubenswrapper[4790]: I1011 10:42:12.689338 4790 scope.go:117] "RemoveContainer" containerID="199044355c2ca69bb6ac6dacb80814ae2bdf5a58183730d9a6202cd23bded675" Oct 11 10:42:12.690307 master-0 kubenswrapper[4790]: I1011 10:42:12.689439 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76f8bc4746-9rjdm" Oct 11 10:42:12.722795 master-0 kubenswrapper[4790]: I1011 10:42:12.722686 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76f8bc4746-9rjdm"] Oct 11 10:42:12.737228 master-0 kubenswrapper[4790]: I1011 10:42:12.737136 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76f8bc4746-9rjdm"] Oct 11 10:42:13.633603 master-0 kubenswrapper[4790]: I1011 10:42:13.633501 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-69df5d46bc-wjtq5"] Oct 11 10:42:13.634210 master-0 kubenswrapper[4790]: I1011 10:42:13.634133 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" containerID="cri-o://29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a" gracePeriod=120 Oct 11 10:42:13.634306 master-0 kubenswrapper[4790]: I1011 10:42:13.634186 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver-check-endpoints" containerID="cri-o://dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef" gracePeriod=120 Oct 11 10:42:14.301081 master-0 kubenswrapper[4790]: I1011 10:42:14.301010 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" path="/var/lib/kubelet/pods/ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db/volumes" Oct 11 10:42:14.335423 master-0 kubenswrapper[4790]: I1011 10:42:14.335350 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Oct 11 10:42:14.336046 master-0 kubenswrapper[4790]: I1011 10:42:14.335928 4790 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/installer-5-master-0" podUID="b5b44d0e-0afa-47db-a215-114b99006a12" containerName="installer" containerID="cri-o://4be921d32cc962b169c97845895b27ff57f7dd21aaaefe6e16cf013e922ab5ec" gracePeriod=30 Oct 11 10:42:14.708622 master-0 kubenswrapper[4790]: I1011 10:42:14.708445 4790 generic.go:334] "Generic (PLEG): container finished" podID="099ca022-6e9c-4604-b517-d90713dd6a44" containerID="dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef" exitCode=0 Oct 11 10:42:14.708622 master-0 kubenswrapper[4790]: I1011 10:42:14.708538 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" event={"ID":"099ca022-6e9c-4604-b517-d90713dd6a44","Type":"ContainerDied","Data":"dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef"} Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: I1011 10:42:15.075973 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:15.076101 master-0 
kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:15.076101 master-0 kubenswrapper[4790]: I1011 10:42:15.076092 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:19.718766 master-0 kubenswrapper[4790]: I1011 10:42:19.718636 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Oct 11 10:42:19.719583 master-0 kubenswrapper[4790]: E1011 10:42:19.718981 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" Oct 11 10:42:19.719583 master-0 kubenswrapper[4790]: I1011 10:42:19.719012 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" Oct 11 10:42:19.719583 master-0 kubenswrapper[4790]: I1011 10:42:19.719172 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ab7a5bf0-d3df-49f7-bd97-a7b9425fe9db" containerName="console" Oct 11 10:42:19.720280 master-0 kubenswrapper[4790]: I1011 10:42:19.720052 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.735804 master-0 kubenswrapper[4790]: I1011 10:42:19.735729 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Oct 11 10:42:19.796145 master-0 kubenswrapper[4790]: I1011 10:42:19.796001 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-var-lock\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.796145 master-0 kubenswrapper[4790]: I1011 10:42:19.796150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.796438 master-0 kubenswrapper[4790]: I1011 10:42:19.796191 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7063ccc-c150-41d0-9285-8a8ca00aa417-kube-api-access\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.897480 master-0 kubenswrapper[4790]: I1011 10:42:19.897371 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7063ccc-c150-41d0-9285-8a8ca00aa417-kube-api-access\") pod \"installer-6-master-0\" 
(UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.897845 master-0 kubenswrapper[4790]: I1011 10:42:19.897519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-var-lock\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.897845 master-0 kubenswrapper[4790]: I1011 10:42:19.897592 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.897845 master-0 kubenswrapper[4790]: I1011 10:42:19.897752 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.898018 master-0 kubenswrapper[4790]: I1011 10:42:19.897893 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-var-lock\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:19.924507 master-0 kubenswrapper[4790]: I1011 10:42:19.924212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7063ccc-c150-41d0-9285-8a8ca00aa417-kube-api-access\") pod \"installer-6-master-0\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") " 
pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:20.045233 master-0 kubenswrapper[4790]: I1011 10:42:20.045018 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: I1011 10:42:20.068397 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:20.068607 master-0 
kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:20.068607 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:20.069287 master-0 kubenswrapper[4790]: I1011 10:42:20.068650 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:20.282966 master-0 kubenswrapper[4790]: I1011 10:42:20.282875 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Oct 11 10:42:20.752920 master-0 kubenswrapper[4790]: I1011 10:42:20.752849 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"e7063ccc-c150-41d0-9285-8a8ca00aa417","Type":"ContainerStarted","Data":"194353fb7acfeb121812b2d62c7722c179dced595ba3e814ace7d8070862578b"} Oct 11 10:42:20.754041 master-0 kubenswrapper[4790]: I1011 10:42:20.754003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"e7063ccc-c150-41d0-9285-8a8ca00aa417","Type":"ContainerStarted","Data":"df3c31d752a92f830ac660f11dc711746fedff638b520cb70b6e043fe897e4d1"} Oct 11 10:42:20.781106 master-0 kubenswrapper[4790]: I1011 10:42:20.780983 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-6-master-0" podStartSLOduration=1.780951299 podStartE2EDuration="1.780951299s" podCreationTimestamp="2025-10-11 10:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 
10:42:20.776851076 +0000 UTC m=+217.331311428" watchObservedRunningTime="2025-10-11 10:42:20.780951299 +0000 UTC m=+217.335411631" Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: I1011 10:42:25.069874 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:25.070019 master-0 kubenswrapper[4790]: I1011 10:42:25.070013 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:25.072142 master-0 kubenswrapper[4790]: I1011 10:42:25.070204 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: I1011 10:42:30.066368 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: 
[+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:30.066477 master-0 kubenswrapper[4790]: I1011 10:42:30.066476 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: I1011 10:42:35.071268 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:35.071412 master-0 
kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:35.071412 master-0 kubenswrapper[4790]: I1011 10:42:35.071397 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: I1011 10:42:40.064812 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 
10:42:40.064989 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:40.064989 master-0 kubenswrapper[4790]: I1011 10:42:40.064920 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:40.870983 master-0 kubenswrapper[4790]: I1011 10:42:40.870909 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_b5b44d0e-0afa-47db-a215-114b99006a12/installer/0.log" Oct 11 10:42:40.870983 master-0 kubenswrapper[4790]: I1011 10:42:40.870976 4790 generic.go:334] "Generic (PLEG): container finished" podID="b5b44d0e-0afa-47db-a215-114b99006a12" containerID="4be921d32cc962b169c97845895b27ff57f7dd21aaaefe6e16cf013e922ab5ec" exitCode=1 Oct 11 10:42:40.871309 master-0 kubenswrapper[4790]: I1011 10:42:40.871016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"b5b44d0e-0afa-47db-a215-114b99006a12","Type":"ContainerDied","Data":"4be921d32cc962b169c97845895b27ff57f7dd21aaaefe6e16cf013e922ab5ec"} Oct 11 10:42:40.932361 master-0 kubenswrapper[4790]: I1011 10:42:40.932280 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_b5b44d0e-0afa-47db-a215-114b99006a12/installer/0.log" Oct 11 10:42:40.932361 master-0 kubenswrapper[4790]: I1011 10:42:40.932368 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:41.075200 master-0 kubenswrapper[4790]: I1011 10:42:41.075030 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-kubelet-dir\") pod \"b5b44d0e-0afa-47db-a215-114b99006a12\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " Oct 11 10:42:41.075200 master-0 kubenswrapper[4790]: I1011 10:42:41.075188 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b44d0e-0afa-47db-a215-114b99006a12-kube-api-access\") pod \"b5b44d0e-0afa-47db-a215-114b99006a12\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " Oct 11 10:42:41.076196 master-0 kubenswrapper[4790]: I1011 10:42:41.075267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b5b44d0e-0afa-47db-a215-114b99006a12" (UID: "b5b44d0e-0afa-47db-a215-114b99006a12"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:42:41.076196 master-0 kubenswrapper[4790]: I1011 10:42:41.075312 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-var-lock\") pod \"b5b44d0e-0afa-47db-a215-114b99006a12\" (UID: \"b5b44d0e-0afa-47db-a215-114b99006a12\") " Oct 11 10:42:41.076196 master-0 kubenswrapper[4790]: I1011 10:42:41.075500 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-var-lock" (OuterVolumeSpecName: "var-lock") pod "b5b44d0e-0afa-47db-a215-114b99006a12" (UID: "b5b44d0e-0afa-47db-a215-114b99006a12"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:42:41.076196 master-0 kubenswrapper[4790]: I1011 10:42:41.075770 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:41.076196 master-0 kubenswrapper[4790]: I1011 10:42:41.075813 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5b44d0e-0afa-47db-a215-114b99006a12-var-lock\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:41.078039 master-0 kubenswrapper[4790]: I1011 10:42:41.077966 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b44d0e-0afa-47db-a215-114b99006a12-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b5b44d0e-0afa-47db-a215-114b99006a12" (UID: "b5b44d0e-0afa-47db-a215-114b99006a12"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:42:41.176681 master-0 kubenswrapper[4790]: I1011 10:42:41.176523 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5b44d0e-0afa-47db-a215-114b99006a12-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:41.573518 master-0 kubenswrapper[4790]: I1011 10:42:41.573410 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Oct 11 10:42:41.573959 master-0 kubenswrapper[4790]: E1011 10:42:41.573606 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b44d0e-0afa-47db-a215-114b99006a12" containerName="installer" Oct 11 10:42:41.573959 master-0 kubenswrapper[4790]: I1011 10:42:41.573622 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b44d0e-0afa-47db-a215-114b99006a12" containerName="installer" Oct 11 10:42:41.573959 master-0 kubenswrapper[4790]: I1011 10:42:41.573691 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b44d0e-0afa-47db-a215-114b99006a12" containerName="installer" Oct 11 10:42:41.574508 master-0 kubenswrapper[4790]: I1011 10:42:41.574469 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:41.608072 master-0 kubenswrapper[4790]: I1011 10:42:41.608010 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Oct 11 10:42:41.682886 master-0 kubenswrapper[4790]: I1011 10:42:41.682803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/578baefcb284ee2ba6604eeb80ded917-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"578baefcb284ee2ba6604eeb80ded917\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:41.683154 master-0 kubenswrapper[4790]: I1011 10:42:41.682901 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/578baefcb284ee2ba6604eeb80ded917-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"578baefcb284ee2ba6604eeb80ded917\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:41.783903 master-0 kubenswrapper[4790]: I1011 10:42:41.783843 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/578baefcb284ee2ba6604eeb80ded917-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"578baefcb284ee2ba6604eeb80ded917\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:41.783994 master-0 kubenswrapper[4790]: I1011 10:42:41.783948 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/578baefcb284ee2ba6604eeb80ded917-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"578baefcb284ee2ba6604eeb80ded917\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 
10:42:41.784030 master-0 kubenswrapper[4790]: I1011 10:42:41.783969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/578baefcb284ee2ba6604eeb80ded917-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"578baefcb284ee2ba6604eeb80ded917\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:41.784106 master-0 kubenswrapper[4790]: I1011 10:42:41.784005 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/578baefcb284ee2ba6604eeb80ded917-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"578baefcb284ee2ba6604eeb80ded917\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:41.879253 master-0 kubenswrapper[4790]: I1011 10:42:41.879067 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_b5b44d0e-0afa-47db-a215-114b99006a12/installer/0.log" Oct 11 10:42:41.879253 master-0 kubenswrapper[4790]: I1011 10:42:41.879246 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Oct 11 10:42:41.879550 master-0 kubenswrapper[4790]: I1011 10:42:41.879253 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"b5b44d0e-0afa-47db-a215-114b99006a12","Type":"ContainerDied","Data":"b2f3fc67df160d3adfc50304aaf54cdfff608931c2d0a54d0ec7908bd3846999"} Oct 11 10:42:41.879550 master-0 kubenswrapper[4790]: I1011 10:42:41.879386 4790 scope.go:117] "RemoveContainer" containerID="4be921d32cc962b169c97845895b27ff57f7dd21aaaefe6e16cf013e922ab5ec" Oct 11 10:42:41.881336 master-0 kubenswrapper[4790]: I1011 10:42:41.881302 4790 generic.go:334] "Generic (PLEG): container finished" podID="3d2957c2-bc3c-4399-b508-37a1a7689108" containerID="2d6f8b55cb9d1ca99486dd5352b34512a263c62c8f4e3533fc79b7394d55c8e0" exitCode=0 Oct 11 10:42:41.881414 master-0 kubenswrapper[4790]: I1011 10:42:41.881346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" event={"ID":"3d2957c2-bc3c-4399-b508-37a1a7689108","Type":"ContainerDied","Data":"2d6f8b55cb9d1ca99486dd5352b34512a263c62c8f4e3533fc79b7394d55c8e0"} Oct 11 10:42:41.906345 master-0 kubenswrapper[4790]: I1011 10:42:41.906277 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:41.925229 master-0 kubenswrapper[4790]: I1011 10:42:41.925162 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Oct 11 10:42:41.933252 master-0 kubenswrapper[4790]: I1011 10:42:41.933204 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Oct 11 10:42:42.300082 master-0 kubenswrapper[4790]: I1011 10:42:42.299962 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b44d0e-0afa-47db-a215-114b99006a12" path="/var/lib/kubelet/pods/b5b44d0e-0afa-47db-a215-114b99006a12/volumes" Oct 11 10:42:42.889819 master-0 kubenswrapper[4790]: I1011 10:42:42.889691 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"578baefcb284ee2ba6604eeb80ded917","Type":"ContainerStarted","Data":"7ea37e7e34cad88e82c46b5464822e5877d8a824b39f1c0da3fb1426b367c1f3"} Oct 11 10:42:42.889819 master-0 kubenswrapper[4790]: I1011 10:42:42.889807 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"578baefcb284ee2ba6604eeb80ded917","Type":"ContainerStarted","Data":"b8775b7a9049a31b31656c7e341d2282e59cafc06361cc9afd7caf5eb1efcbec"} Oct 11 10:42:43.212405 master-0 kubenswrapper[4790]: I1011 10:42:43.212350 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:43.402978 master-0 kubenswrapper[4790]: I1011 10:42:43.402874 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-var-lock\") pod \"3d2957c2-bc3c-4399-b508-37a1a7689108\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " Oct 11 10:42:43.403642 master-0 kubenswrapper[4790]: I1011 10:42:43.403031 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2957c2-bc3c-4399-b508-37a1a7689108-kube-api-access\") pod \"3d2957c2-bc3c-4399-b508-37a1a7689108\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " Oct 11 10:42:43.403642 master-0 kubenswrapper[4790]: I1011 10:42:43.403073 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-kubelet-dir\") pod \"3d2957c2-bc3c-4399-b508-37a1a7689108\" (UID: \"3d2957c2-bc3c-4399-b508-37a1a7689108\") " Oct 11 10:42:43.403642 master-0 kubenswrapper[4790]: I1011 10:42:43.403220 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-var-lock" (OuterVolumeSpecName: "var-lock") pod "3d2957c2-bc3c-4399-b508-37a1a7689108" (UID: "3d2957c2-bc3c-4399-b508-37a1a7689108"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:42:43.403642 master-0 kubenswrapper[4790]: I1011 10:42:43.403373 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3d2957c2-bc3c-4399-b508-37a1a7689108" (UID: "3d2957c2-bc3c-4399-b508-37a1a7689108"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:42:43.403839 master-0 kubenswrapper[4790]: I1011 10:42:43.403652 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:43.403839 master-0 kubenswrapper[4790]: I1011 10:42:43.403697 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3d2957c2-bc3c-4399-b508-37a1a7689108-var-lock\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:43.407204 master-0 kubenswrapper[4790]: I1011 10:42:43.407109 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2957c2-bc3c-4399-b508-37a1a7689108-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3d2957c2-bc3c-4399-b508-37a1a7689108" (UID: "3d2957c2-bc3c-4399-b508-37a1a7689108"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:42:43.505751 master-0 kubenswrapper[4790]: I1011 10:42:43.505480 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2957c2-bc3c-4399-b508-37a1a7689108-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:42:43.901462 master-0 kubenswrapper[4790]: I1011 10:42:43.901388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" event={"ID":"3d2957c2-bc3c-4399-b508-37a1a7689108","Type":"ContainerDied","Data":"93e88f1fafa68d434978386be2c8ac316fefc0276f3cdc7d47fedf2f6e452356"} Oct 11 10:42:43.901462 master-0 kubenswrapper[4790]: I1011 10:42:43.901456 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93e88f1fafa68d434978386be2c8ac316fefc0276f3cdc7d47fedf2f6e452356" Oct 11 10:42:43.901794 master-0 kubenswrapper[4790]: I1011 10:42:43.901613 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Oct 11 10:42:44.908356 master-0 kubenswrapper[4790]: I1011 10:42:44.908264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"578baefcb284ee2ba6604eeb80ded917","Type":"ContainerStarted","Data":"2be0c8bedda4d85d18238774ea882f88e570b3cd1131b154fffcc12ee22bff8d"} Oct 11 10:42:44.908356 master-0 kubenswrapper[4790]: I1011 10:42:44.908349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"578baefcb284ee2ba6604eeb80ded917","Type":"ContainerStarted","Data":"3fea5f6bf2cbad1b08cfa0cc896012dcfddaab4355ed829164961df59c1af634"} Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: I1011 10:42:45.067077 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: 
[+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:45.067186 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:45.068964 master-0 kubenswrapper[4790]: I1011 10:42:45.067205 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:45.919430 master-0 kubenswrapper[4790]: I1011 10:42:45.919334 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"578baefcb284ee2ba6604eeb80ded917","Type":"ContainerStarted","Data":"24ac76c410545bd348677bcc077f677b7d9cf32627171ba56638faa2777c4159"} Oct 11 10:42:45.949280 master-0 kubenswrapper[4790]: I1011 10:42:45.949181 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=4.949155214 podStartE2EDuration="4.949155214s" podCreationTimestamp="2025-10-11 10:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:42:45.94699393 +0000 UTC m=+242.501454262" watchObservedRunningTime="2025-10-11 10:42:45.949155214 +0000 UTC m=+242.503615536" Oct 11 10:42:47.945524 master-0 kubenswrapper[4790]: I1011 10:42:47.945398 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-0"] Oct 11 10:42:47.946548 master-0 kubenswrapper[4790]: E1011 10:42:47.945674 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2957c2-bc3c-4399-b508-37a1a7689108" containerName="installer" Oct 11 10:42:47.946548 master-0 kubenswrapper[4790]: I1011 10:42:47.945738 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2957c2-bc3c-4399-b508-37a1a7689108" containerName="installer" Oct 11 10:42:47.946548 master-0 kubenswrapper[4790]: I1011 10:42:47.945887 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2957c2-bc3c-4399-b508-37a1a7689108" containerName="installer" Oct 11 10:42:47.946834 master-0 kubenswrapper[4790]: I1011 10:42:47.946666 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" Oct 11 10:42:47.949871 master-0 kubenswrapper[4790]: I1011 10:42:47.949745 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 11 10:42:47.949871 master-0 kubenswrapper[4790]: I1011 10:42:47.949849 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"openshift-service-ca.crt" Oct 11 10:42:47.950324 master-0 kubenswrapper[4790]: I1011 10:42:47.949849 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"default-dockercfg-rp764" Oct 11 10:42:47.959878 master-0 kubenswrapper[4790]: I1011 10:42:47.959341 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-0"] Oct 11 10:42:47.971207 master-0 kubenswrapper[4790]: I1011 10:42:47.971127 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5c6v\" (UniqueName: \"kubernetes.io/projected/8f0d8196-2e0b-479b-ba9a-3e65cb92e046-kube-api-access-v5c6v\") pod \"kube-controller-manager-guard-master-0\" (UID: \"8f0d8196-2e0b-479b-ba9a-3e65cb92e046\") " pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" Oct 11 10:42:48.072751 master-0 kubenswrapper[4790]: I1011 10:42:48.072625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5c6v\" (UniqueName: \"kubernetes.io/projected/8f0d8196-2e0b-479b-ba9a-3e65cb92e046-kube-api-access-v5c6v\") pod \"kube-controller-manager-guard-master-0\" (UID: \"8f0d8196-2e0b-479b-ba9a-3e65cb92e046\") " pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" Oct 11 10:42:48.098353 master-0 kubenswrapper[4790]: I1011 10:42:48.098207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-v5c6v\" (UniqueName: \"kubernetes.io/projected/8f0d8196-2e0b-479b-ba9a-3e65cb92e046-kube-api-access-v5c6v\") pod \"kube-controller-manager-guard-master-0\" (UID: \"8f0d8196-2e0b-479b-ba9a-3e65cb92e046\") " pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" Oct 11 10:42:48.265296 master-0 kubenswrapper[4790]: I1011 10:42:48.265073 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" Oct 11 10:42:48.874586 master-0 kubenswrapper[4790]: I1011 10:42:48.874486 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-0"] Oct 11 10:42:48.944404 master-0 kubenswrapper[4790]: I1011 10:42:48.944340 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" event={"ID":"8f0d8196-2e0b-479b-ba9a-3e65cb92e046","Type":"ContainerStarted","Data":"2bc3521fea3b11a800c4cf500fec36b79a21e18921be0bc8a392dace7631f227"} Oct 11 10:42:49.955471 master-0 kubenswrapper[4790]: I1011 10:42:49.955337 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" event={"ID":"8f0d8196-2e0b-479b-ba9a-3e65cb92e046","Type":"ContainerStarted","Data":"26eb134a4d3360d9de34a2d04a9c99747fb6edc563c05fdf173883431831033b"} Oct 11 10:42:49.956613 master-0 kubenswrapper[4790]: I1011 10:42:49.955898 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" Oct 11 10:42:49.962937 master-0 kubenswrapper[4790]: I1011 10:42:49.962854 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" Oct 11 10:42:49.982440 master-0 kubenswrapper[4790]: I1011 
10:42:49.982240 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-0" podStartSLOduration=2.982198491 podStartE2EDuration="2.982198491s" podCreationTimestamp="2025-10-11 10:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:42:49.979919802 +0000 UTC m=+246.534380184" watchObservedRunningTime="2025-10-11 10:42:49.982198491 +0000 UTC m=+246.536658813" Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: I1011 10:42:50.063692 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: 
[+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:50.063788 master-0 kubenswrapper[4790]: I1011 10:42:50.063791 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:51.907191 master-0 kubenswrapper[4790]: I1011 10:42:51.907111 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:51.908339 master-0 kubenswrapper[4790]: I1011 10:42:51.907998 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:51.908339 master-0 kubenswrapper[4790]: I1011 10:42:51.908068 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:51.908339 master-0 kubenswrapper[4790]: I1011 10:42:51.908099 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:51.913354 master-0 kubenswrapper[4790]: I1011 10:42:51.913281 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:51.915986 master-0 kubenswrapper[4790]: I1011 10:42:51.915931 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:52.979754 master-0 kubenswrapper[4790]: I1011 10:42:52.979613 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: I1011 10:42:55.067694 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: 
[+]poststarthook/project.openshift.io-projectcache ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:42:55.067773 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:42:55.068854 master-0 kubenswrapper[4790]: I1011 10:42:55.068817 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:42:58.347909 master-0 kubenswrapper[4790]: I1011 10:42:58.347697 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-0"] Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: I1011 10:43:00.072444 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: 
[+]poststarthook/generic-apiserver-start-informers ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld
Oct 11 10:43:00.072577 master-0 kubenswrapper[4790]: readyz check failed
Oct 11 10:43:00.074301 master-0 kubenswrapper[4790]: I1011 10:43:00.072584 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 10:43:01.921561 master-0 kubenswrapper[4790]: I1011 10:43:01.921483 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Oct 11 10:43:05.064237 master-0 kubenswrapper[4790]: I1011 10:43:05.064133 4790 patch_prober.go:28] interesting
pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:05.065155 master-0 kubenswrapper[4790]: I1011 10:43:05.064522 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:10.059212 master-0 kubenswrapper[4790]: I1011 10:43:10.059090 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:10.059212 master-0 kubenswrapper[4790]: I1011 10:43:10.059197 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:15.059569 master-0 kubenswrapper[4790]: I1011 10:43:15.059415 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:15.059569 master-0 kubenswrapper[4790]: I1011 10:43:15.059551
4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:18.933895 master-0 kubenswrapper[4790]: I1011 10:43:18.933791 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Oct 11 10:43:18.936099 master-0 kubenswrapper[4790]: I1011 10:43:18.936042 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:18.981925 master-0 kubenswrapper[4790]: I1011 10:43:18.981810 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Oct 11 10:43:18.984672 master-0 kubenswrapper[4790]: I1011 10:43:18.984582 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:18.984848 master-0 kubenswrapper[4790]: I1011 10:43:18.984786 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:18.984940 master-0 kubenswrapper[4790]: I1011 10:43:18.984845 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName:
\"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:19.086657 master-0 kubenswrapper[4790]: I1011 10:43:19.086519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:19.086657 master-0 kubenswrapper[4790]: I1011 10:43:19.086604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:19.086657 master-0 kubenswrapper[4790]: I1011 10:43:19.086652 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:19.087238 master-0 kubenswrapper[4790]: I1011 10:43:19.086802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:19.087238 master-0 kubenswrapper[4790]: I1011 10:43:19.086833 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName:
\"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:19.087238 master-0 kubenswrapper[4790]: I1011 10:43:19.086834 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/08bb0ac7b01a53ae0dcb90ce8b66efa1-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"08bb0ac7b01a53ae0dcb90ce8b66efa1\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:19.127879 master-0 kubenswrapper[4790]: I1011 10:43:19.127754 4790 generic.go:334] "Generic (PLEG): container finished" podID="e7063ccc-c150-41d0-9285-8a8ca00aa417" containerID="194353fb7acfeb121812b2d62c7722c179dced595ba3e814ace7d8070862578b" exitCode=0
Oct 11 10:43:19.127879 master-0 kubenswrapper[4790]: I1011 10:43:19.127838 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"e7063ccc-c150-41d0-9285-8a8ca00aa417","Type":"ContainerDied","Data":"194353fb7acfeb121812b2d62c7722c179dced595ba3e814ace7d8070862578b"}
Oct 11 10:43:19.275640 master-0 kubenswrapper[4790]: I1011 10:43:19.275446 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:20.059744 master-0 kubenswrapper[4790]: I1011 10:43:20.059567 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:20.060882 master-0 kubenswrapper[4790]: I1011 10:43:20.059748 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:20.137389 master-0 kubenswrapper[4790]: I1011 10:43:20.137286 4790 generic.go:334] "Generic (PLEG): container finished" podID="08bb0ac7b01a53ae0dcb90ce8b66efa1" containerID="f212a76b747e114411a9d00eac6144e357bffebadcfd5266386a67eb7633032b" exitCode=0
Oct 11 10:43:20.137389 master-0 kubenswrapper[4790]: I1011 10:43:20.137366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"08bb0ac7b01a53ae0dcb90ce8b66efa1","Type":"ContainerDied","Data":"f212a76b747e114411a9d00eac6144e357bffebadcfd5266386a67eb7633032b"}
Oct 11 10:43:20.137781 master-0 kubenswrapper[4790]: I1011 10:43:20.137423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"08bb0ac7b01a53ae0dcb90ce8b66efa1","Type":"ContainerStarted","Data":"165ad903b2032716ae9b5ae764f82587859a8868b43799d916b6208f93295787"}
Oct 11 10:43:20.521966 master-0 kubenswrapper[4790]: I1011 10:43:20.521906 4790 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Oct 11 10:43:20.606869 master-0 kubenswrapper[4790]: I1011 10:43:20.606820 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7063ccc-c150-41d0-9285-8a8ca00aa417-kube-api-access\") pod \"e7063ccc-c150-41d0-9285-8a8ca00aa417\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") "
Oct 11 10:43:20.607021 master-0 kubenswrapper[4790]: I1011 10:43:20.606897 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-kubelet-dir\") pod \"e7063ccc-c150-41d0-9285-8a8ca00aa417\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") "
Oct 11 10:43:20.607021 master-0 kubenswrapper[4790]: I1011 10:43:20.606935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-var-lock\") pod \"e7063ccc-c150-41d0-9285-8a8ca00aa417\" (UID: \"e7063ccc-c150-41d0-9285-8a8ca00aa417\") "
Oct 11 10:43:20.607234 master-0 kubenswrapper[4790]: I1011 10:43:20.607147 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e7063ccc-c150-41d0-9285-8a8ca00aa417" (UID: "e7063ccc-c150-41d0-9285-8a8ca00aa417"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:43:20.607356 master-0 kubenswrapper[4790]: I1011 10:43:20.607195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-var-lock" (OuterVolumeSpecName: "var-lock") pod "e7063ccc-c150-41d0-9285-8a8ca00aa417" (UID: "e7063ccc-c150-41d0-9285-8a8ca00aa417"). InnerVolumeSpecName "var-lock".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:43:20.611137 master-0 kubenswrapper[4790]: I1011 10:43:20.611058 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7063ccc-c150-41d0-9285-8a8ca00aa417-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7063ccc-c150-41d0-9285-8a8ca00aa417" (UID: "e7063ccc-c150-41d0-9285-8a8ca00aa417"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:43:20.709234 master-0 kubenswrapper[4790]: I1011 10:43:20.709116 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7063ccc-c150-41d0-9285-8a8ca00aa417-kube-api-access\") on node \"master-0\" DevicePath \"\""
Oct 11 10:43:20.709234 master-0 kubenswrapper[4790]: I1011 10:43:20.709176 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Oct 11 10:43:20.709234 master-0 kubenswrapper[4790]: I1011 10:43:20.709190 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7063ccc-c150-41d0-9285-8a8ca00aa417-var-lock\") on node \"master-0\" DevicePath \"\""
Oct 11 10:43:21.150139 master-0 kubenswrapper[4790]: I1011 10:43:21.150071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"e7063ccc-c150-41d0-9285-8a8ca00aa417","Type":"ContainerDied","Data":"df3c31d752a92f830ac660f11dc711746fedff638b520cb70b6e043fe897e4d1"}
Oct 11 10:43:21.150576 master-0 kubenswrapper[4790]: I1011 10:43:21.150141 4790 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Oct 11 10:43:21.150576 master-0 kubenswrapper[4790]: I1011 10:43:21.150149 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df3c31d752a92f830ac660f11dc711746fedff638b520cb70b6e043fe897e4d1"
Oct 11 10:43:21.160695 master-0 kubenswrapper[4790]: I1011 10:43:21.159935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"08bb0ac7b01a53ae0dcb90ce8b66efa1","Type":"ContainerStarted","Data":"ef053c2ec24777c6f73bf24c58ce6d81648a2919f61739a5cde9038627df1879"}
Oct 11 10:43:21.160695 master-0 kubenswrapper[4790]: I1011 10:43:21.159992 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"08bb0ac7b01a53ae0dcb90ce8b66efa1","Type":"ContainerStarted","Data":"489c4661078f82b6a0b68d83fed1c53684d72b0df178edbd87252ff524a895b3"}
Oct 11 10:43:21.160695 master-0 kubenswrapper[4790]: I1011 10:43:21.160002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"08bb0ac7b01a53ae0dcb90ce8b66efa1","Type":"ContainerStarted","Data":"df2872816af4c38af6c470c634071a8b5944f8a31faf49fe4569a9da208e13f7"}
Oct 11 10:43:21.160695 master-0 kubenswrapper[4790]: I1011 10:43:21.160012 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"08bb0ac7b01a53ae0dcb90ce8b66efa1","Type":"ContainerStarted","Data":"b09ec9f78f9897a605241691754a162993368392c8020db0290ac43a9862d1f3"}
Oct 11 10:43:22.167498 master-0 kubenswrapper[4790]: I1011 10:43:22.167446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"08bb0ac7b01a53ae0dcb90ce8b66efa1","Type":"ContainerStarted","Data":"67efb8bfe5a6f1b758ca157bfcef80c59b85907f88572dd02d40afe6e9896027"}
Oct 11 10:43:22.168356
master-0 kubenswrapper[4790]: I1011 10:43:22.168340 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:22.196639 master-0 kubenswrapper[4790]: I1011 10:43:22.196524 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=4.196501143 podStartE2EDuration="4.196501143s" podCreationTimestamp="2025-10-11 10:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:43:22.192667417 +0000 UTC m=+278.747127709" watchObservedRunningTime="2025-10-11 10:43:22.196501143 +0000 UTC m=+278.750961435"
Oct 11 10:43:24.276360 master-0 kubenswrapper[4790]: I1011 10:43:24.276283 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:24.276360 master-0 kubenswrapper[4790]: I1011 10:43:24.276349 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:24.284871 master-0 kubenswrapper[4790]: I1011 10:43:24.284810 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:25.058895 master-0 kubenswrapper[4790]: I1011 10:43:25.058827 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:25.059174 master-0 kubenswrapper[4790]: I1011 10:43:25.058894 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44"
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:25.192553 master-0 kubenswrapper[4790]: I1011 10:43:25.192430 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:29.990273 master-0 kubenswrapper[4790]: I1011 10:43:29.990057 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-0"]
Oct 11 10:43:29.990892 master-0 kubenswrapper[4790]: E1011 10:43:29.990366 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7063ccc-c150-41d0-9285-8a8ca00aa417" containerName="installer"
Oct 11 10:43:29.990892 master-0 kubenswrapper[4790]: I1011 10:43:29.990390 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7063ccc-c150-41d0-9285-8a8ca00aa417" containerName="installer"
Oct 11 10:43:29.990892 master-0 kubenswrapper[4790]: I1011 10:43:29.990532 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7063ccc-c150-41d0-9285-8a8ca00aa417" containerName="installer"
Oct 11 10:43:29.991289 master-0 kubenswrapper[4790]: I1011 10:43:29.991254 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-guard-master-0"
Oct 11 10:43:29.993797 master-0 kubenswrapper[4790]: I1011 10:43:29.993745 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"default-dockercfg-hlr4b"
Oct 11 10:43:29.995182 master-0 kubenswrapper[4790]: I1011 10:43:29.994949 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"openshift-service-ca.crt"
Oct 11 10:43:29.995556 master-0 kubenswrapper[4790]: I1011 10:43:29.995465 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Oct 11 10:43:30.014865 master-0 kubenswrapper[4790]: I1011 10:43:30.009521 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-0"]
Oct 11 10:43:30.055919 master-0 kubenswrapper[4790]: I1011 10:43:30.055833 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2vct\" (UniqueName: \"kubernetes.io/projected/acfb978c-45a9-4081-9d1e-3751eea1b483-kube-api-access-v2vct\") pod \"kube-apiserver-guard-master-0\" (UID: \"acfb978c-45a9-4081-9d1e-3751eea1b483\") " pod="openshift-kube-apiserver/kube-apiserver-guard-master-0"
Oct 11 10:43:30.059293 master-0 kubenswrapper[4790]: I1011 10:43:30.059195 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:30.059422 master-0 kubenswrapper[4790]: I1011 10:43:30.059341 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure"
output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:30.157083 master-0 kubenswrapper[4790]: I1011 10:43:30.156970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2vct\" (UniqueName: \"kubernetes.io/projected/acfb978c-45a9-4081-9d1e-3751eea1b483-kube-api-access-v2vct\") pod \"kube-apiserver-guard-master-0\" (UID: \"acfb978c-45a9-4081-9d1e-3751eea1b483\") " pod="openshift-kube-apiserver/kube-apiserver-guard-master-0"
Oct 11 10:43:30.183843 master-0 kubenswrapper[4790]: I1011 10:43:30.183751 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2vct\" (UniqueName: \"kubernetes.io/projected/acfb978c-45a9-4081-9d1e-3751eea1b483-kube-api-access-v2vct\") pod \"kube-apiserver-guard-master-0\" (UID: \"acfb978c-45a9-4081-9d1e-3751eea1b483\") " pod="openshift-kube-apiserver/kube-apiserver-guard-master-0"
Oct 11 10:43:30.322297 master-0 kubenswrapper[4790]: I1011 10:43:30.322175 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-guard-master-0"
Oct 11 10:43:30.811428 master-0 kubenswrapper[4790]: I1011 10:43:30.811149 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-0"]
Oct 11 10:43:31.220137 master-0 kubenswrapper[4790]: I1011 10:43:31.220043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-guard-master-0" event={"ID":"acfb978c-45a9-4081-9d1e-3751eea1b483","Type":"ContainerStarted","Data":"44f0928c4192874c201c618b86ce60c31a0620f8ec403ce90ee4b0b25f138ba0"}
Oct 11 10:43:31.220137 master-0 kubenswrapper[4790]: I1011 10:43:31.220114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-guard-master-0" event={"ID":"acfb978c-45a9-4081-9d1e-3751eea1b483","Type":"ContainerStarted","Data":"ac6ea93523ef5806f84360870a8fcd92a3cfc634675b21ddfa7edd0303dd1afc"}
Oct 11 10:43:31.221407 master-0 kubenswrapper[4790]: I1011 10:43:31.220512 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-guard-master-0"
Oct 11 10:43:31.228829 master-0 kubenswrapper[4790]: I1011 10:43:31.228702 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-guard-master-0"
Oct 11 10:43:31.248526 master-0 kubenswrapper[4790]: I1011 10:43:31.248363 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-guard-master-0" podStartSLOduration=2.248318007 podStartE2EDuration="2.248318007s" podCreationTimestamp="2025-10-11 10:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:43:31.241653061 +0000 UTC m=+287.796113433" watchObservedRunningTime="2025-10-11 10:43:31.248318007 +0000 UTC m=+287.802778339"
Oct 11 10:43:35.059329
master-0 kubenswrapper[4790]: I1011 10:43:35.059190 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:35.059329 master-0 kubenswrapper[4790]: I1011 10:43:35.059318 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:38.190374 master-0 kubenswrapper[4790]: I1011 10:43:38.190211 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-0"]
Oct 11 10:43:39.281857 master-0 kubenswrapper[4790]: I1011 10:43:39.281772 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Oct 11 10:43:40.059341 master-0 kubenswrapper[4790]: I1011 10:43:40.059274 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:40.059806 master-0 kubenswrapper[4790]: I1011 10:43:40.059750 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect:
connection refused"
Oct 11 10:43:44.153021 master-0 kubenswrapper[4790]: I1011 10:43:44.152960 4790 kubelet.go:1505] "Image garbage collection succeeded"
Oct 11 10:43:45.060697 master-0 kubenswrapper[4790]: I1011 10:43:45.060605 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:45.062989 master-0 kubenswrapper[4790]: I1011 10:43:45.060700 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:50.061206 master-0 kubenswrapper[4790]: I1011 10:43:50.060637 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:50.061206 master-0 kubenswrapper[4790]: I1011 10:43:50.060782 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:43:55.060073 master-0 kubenswrapper[4790]: I1011 10:43:55.059915 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver:
Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:43:55.061214 master-0 kubenswrapper[4790]: I1011 10:43:55.060099 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:44:00.059463 master-0 kubenswrapper[4790]: I1011 10:44:00.059368 4790 patch_prober.go:28] interesting pod/apiserver-69df5d46bc-wjtq5 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused" start-of-body=
Oct 11 10:44:00.060517 master-0 kubenswrapper[4790]: I1011 10:44:00.059474 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.130.0.13:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.13:8443: connect: connection refused"
Oct 11 10:44:03.765273 master-0 kubenswrapper[4790]: I1011 10:44:03.765197 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"]
Oct 11 10:44:03.765826 master-0 kubenswrapper[4790]: I1011 10:44:03.765515 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver"
containerID="cri-o://35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c" gracePeriod=120
Oct 11 10:44:04.044321 master-0 kubenswrapper[4790]: I1011 10:44:04.044274 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5"
Oct 11 10:44:04.113615 master-0 kubenswrapper[4790]: I1011 10:44:04.113488 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-image-import-ca\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") "
Oct 11 10:44:04.113615 master-0 kubenswrapper[4790]: I1011 10:44:04.113583 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-serving-ca\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") "
Oct 11 10:44:04.113615 master-0 kubenswrapper[4790]: I1011 10:44:04.113622 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-encryption-config\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") "
Oct 11 10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.113680 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-client\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") "
Oct 11 10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.113754 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88h25\" (UniqueName:
\"kubernetes.io/projected/099ca022-6e9c-4604-b517-d90713dd6a44-kube-api-access-88h25\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " Oct 11 10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.113921 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-config\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " Oct 11 10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.113999 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-trusted-ca-bundle\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " Oct 11 10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.114031 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-node-pullsecrets\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " Oct 11 10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.114079 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-audit-dir\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " Oct 11 10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.114100 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-audit\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " Oct 11 
10:44:04.114128 master-0 kubenswrapper[4790]: I1011 10:44:04.114134 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-serving-cert\") pod \"099ca022-6e9c-4604-b517-d90713dd6a44\" (UID: \"099ca022-6e9c-4604-b517-d90713dd6a44\") " Oct 11 10:44:04.114564 master-0 kubenswrapper[4790]: I1011 10:44:04.114283 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:44:04.114610 master-0 kubenswrapper[4790]: I1011 10:44:04.114499 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:04.114658 master-0 kubenswrapper[4790]: I1011 10:44:04.114567 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:44:04.114701 master-0 kubenswrapper[4790]: I1011 10:44:04.114662 4790 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.114701 master-0 kubenswrapper[4790]: I1011 10:44:04.114682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:04.115031 master-0 kubenswrapper[4790]: I1011 10:44:04.114834 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-config" (OuterVolumeSpecName: "config") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:04.115143 master-0 kubenswrapper[4790]: I1011 10:44:04.115118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-audit" (OuterVolumeSpecName: "audit") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:04.115212 master-0 kubenswrapper[4790]: I1011 10:44:04.115140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:04.118097 master-0 kubenswrapper[4790]: I1011 10:44:04.118008 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099ca022-6e9c-4604-b517-d90713dd6a44-kube-api-access-88h25" (OuterVolumeSpecName: "kube-api-access-88h25") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "kube-api-access-88h25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:44:04.118929 master-0 kubenswrapper[4790]: I1011 10:44:04.118330 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:44:04.118929 master-0 kubenswrapper[4790]: I1011 10:44:04.118631 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:44:04.119915 master-0 kubenswrapper[4790]: I1011 10:44:04.119866 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "099ca022-6e9c-4604-b517-d90713dd6a44" (UID: "099ca022-6e9c-4604-b517-d90713dd6a44"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216136 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-client\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216187 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88h25\" (UniqueName: \"kubernetes.io/projected/099ca022-6e9c-4604-b517-d90713dd6a44-kube-api-access-88h25\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216204 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216217 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216229 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099ca022-6e9c-4604-b517-d90713dd6a44-audit-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216241 4790 
reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-audit\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216252 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-serving-cert\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216264 4790 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-image-import-ca\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216275 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099ca022-6e9c-4604-b517-d90713dd6a44-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.216247 master-0 kubenswrapper[4790]: I1011 10:44:04.216290 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099ca022-6e9c-4604-b517-d90713dd6a44-encryption-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:04.415391 master-0 kubenswrapper[4790]: I1011 10:44:04.415284 4790 generic.go:334] "Generic (PLEG): container finished" podID="099ca022-6e9c-4604-b517-d90713dd6a44" containerID="29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a" exitCode=0 Oct 11 10:44:04.415391 master-0 kubenswrapper[4790]: I1011 10:44:04.415365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" event={"ID":"099ca022-6e9c-4604-b517-d90713dd6a44","Type":"ContainerDied","Data":"29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a"} Oct 11 10:44:04.415997 master-0 kubenswrapper[4790]: I1011 
10:44:04.415403 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" Oct 11 10:44:04.415997 master-0 kubenswrapper[4790]: I1011 10:44:04.415455 4790 scope.go:117] "RemoveContainer" containerID="dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef" Oct 11 10:44:04.415997 master-0 kubenswrapper[4790]: I1011 10:44:04.415433 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69df5d46bc-wjtq5" event={"ID":"099ca022-6e9c-4604-b517-d90713dd6a44","Type":"ContainerDied","Data":"1623fe4c5875e03dc8879c8e18034ad8d12a92c81f76f68ae1603f1e4ba99e21"} Oct 11 10:44:04.434469 master-0 kubenswrapper[4790]: I1011 10:44:04.434168 4790 scope.go:117] "RemoveContainer" containerID="29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a" Oct 11 10:44:04.445998 master-0 kubenswrapper[4790]: I1011 10:44:04.445914 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-69df5d46bc-wjtq5"] Oct 11 10:44:04.454029 master-0 kubenswrapper[4790]: I1011 10:44:04.453955 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-69df5d46bc-wjtq5"] Oct 11 10:44:04.456188 master-0 kubenswrapper[4790]: I1011 10:44:04.456142 4790 scope.go:117] "RemoveContainer" containerID="035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c" Oct 11 10:44:04.480156 master-0 kubenswrapper[4790]: I1011 10:44:04.480098 4790 scope.go:117] "RemoveContainer" containerID="dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef" Oct 11 10:44:04.480780 master-0 kubenswrapper[4790]: E1011 10:44:04.480729 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef\": container with ID starting with dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef not found: ID 
does not exist" containerID="dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef" Oct 11 10:44:04.480891 master-0 kubenswrapper[4790]: I1011 10:44:04.480791 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef"} err="failed to get container status \"dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef\": rpc error: code = NotFound desc = could not find container \"dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef\": container with ID starting with dd112f308e676b2c4d179d5aff947e42c2b7c6ef683c69f8c917e46b03a6eeef not found: ID does not exist" Oct 11 10:44:04.480891 master-0 kubenswrapper[4790]: I1011 10:44:04.480889 4790 scope.go:117] "RemoveContainer" containerID="29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a" Oct 11 10:44:04.481436 master-0 kubenswrapper[4790]: E1011 10:44:04.481395 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a\": container with ID starting with 29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a not found: ID does not exist" containerID="29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a" Oct 11 10:44:04.481483 master-0 kubenswrapper[4790]: I1011 10:44:04.481444 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a"} err="failed to get container status \"29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a\": rpc error: code = NotFound desc = could not find container \"29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a\": container with ID starting with 29c2667267dacd2b8bea9208822dffdfcb241820cba338aed8d3c7b0b7e5651a not found: ID does not exist" Oct 11 10:44:04.481522 
master-0 kubenswrapper[4790]: I1011 10:44:04.481484 4790 scope.go:117] "RemoveContainer" containerID="035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c" Oct 11 10:44:04.481900 master-0 kubenswrapper[4790]: E1011 10:44:04.481840 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c\": container with ID starting with 035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c not found: ID does not exist" containerID="035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c" Oct 11 10:44:04.481944 master-0 kubenswrapper[4790]: I1011 10:44:04.481891 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c"} err="failed to get container status \"035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c\": rpc error: code = NotFound desc = could not find container \"035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c\": container with ID starting with 035fa9c3481f694000fd14dae9e171055c8a6110f7029a9f53a9e12f4eeb425c not found: ID does not exist" Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: I1011 10:44:05.048218 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:44:05.048279 master-0 
kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:44:05.048279 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:44:05.049256 master-0 kubenswrapper[4790]: I1011 10:44:05.048298 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:44:06.298699 master-0 kubenswrapper[4790]: I1011 10:44:06.298588 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" path="/var/lib/kubelet/pods/099ca022-6e9c-4604-b517-d90713dd6a44/volumes" Oct 11 10:44:07.330936 master-0 kubenswrapper[4790]: I1011 10:44:07.330791 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f9d445f57-w4nwq"] Oct 11 10:44:07.331472 master-0 kubenswrapper[4790]: E1011 10:44:07.331174 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" Oct 11 10:44:07.331472 master-0 kubenswrapper[4790]: I1011 10:44:07.331193 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" Oct 11 10:44:07.331472 master-0 
kubenswrapper[4790]: E1011 10:44:07.331206 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="fix-audit-permissions" Oct 11 10:44:07.331472 master-0 kubenswrapper[4790]: I1011 10:44:07.331213 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="fix-audit-permissions" Oct 11 10:44:07.331472 master-0 kubenswrapper[4790]: E1011 10:44:07.331225 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver-check-endpoints" Oct 11 10:44:07.331472 master-0 kubenswrapper[4790]: I1011 10:44:07.331232 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver-check-endpoints" Oct 11 10:44:07.331472 master-0 kubenswrapper[4790]: I1011 10:44:07.331294 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver" Oct 11 10:44:07.331472 master-0 kubenswrapper[4790]: I1011 10:44:07.331303 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="099ca022-6e9c-4604-b517-d90713dd6a44" containerName="openshift-apiserver-check-endpoints" Oct 11 10:44:07.331825 master-0 kubenswrapper[4790]: I1011 10:44:07.331795 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.336211 master-0 kubenswrapper[4790]: I1011 10:44:07.336086 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 11 10:44:07.336469 master-0 kubenswrapper[4790]: I1011 10:44:07.336293 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-8wmjp" Oct 11 10:44:07.336469 master-0 kubenswrapper[4790]: I1011 10:44:07.336365 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 11 10:44:07.336740 master-0 kubenswrapper[4790]: I1011 10:44:07.336675 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 11 10:44:07.336840 master-0 kubenswrapper[4790]: I1011 10:44:07.336815 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 11 10:44:07.337107 master-0 kubenswrapper[4790]: I1011 10:44:07.337062 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 11 10:44:07.337336 master-0 kubenswrapper[4790]: I1011 10:44:07.337292 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 11 10:44:07.337613 master-0 kubenswrapper[4790]: I1011 10:44:07.337568 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 11 10:44:07.346663 master-0 kubenswrapper[4790]: I1011 10:44:07.346605 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 11 10:44:07.374150 master-0 kubenswrapper[4790]: I1011 10:44:07.374094 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f9d445f57-w4nwq"] Oct 11 10:44:07.391595 master-0 kubenswrapper[4790]: I1011 10:44:07.391488 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv7r\" (UniqueName: \"kubernetes.io/projected/e299247b-558b-4b6c-9d7c-335475344fdc-kube-api-access-brv7r\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.391595 master-0 kubenswrapper[4790]: I1011 10:44:07.391589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-serving-cert\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.391907 master-0 kubenswrapper[4790]: I1011 10:44:07.391617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-console-config\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.391907 master-0 kubenswrapper[4790]: I1011 10:44:07.391632 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-oauth-serving-cert\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.391907 master-0 kubenswrapper[4790]: I1011 10:44:07.391651 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-service-ca\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " 
pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.391907 master-0 kubenswrapper[4790]: I1011 10:44:07.391684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-trusted-ca-bundle\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.391907 master-0 kubenswrapper[4790]: I1011 10:44:07.391743 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-oauth-config\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.493109 master-0 kubenswrapper[4790]: I1011 10:44:07.493017 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-console-config\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.493109 master-0 kubenswrapper[4790]: I1011 10:44:07.493095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-oauth-serving-cert\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.493408 master-0 kubenswrapper[4790]: I1011 10:44:07.493156 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-service-ca\") pod 
\"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.493408 master-0 kubenswrapper[4790]: I1011 10:44:07.493200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-trusted-ca-bundle\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.493408 master-0 kubenswrapper[4790]: I1011 10:44:07.493250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-oauth-config\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.493408 master-0 kubenswrapper[4790]: I1011 10:44:07.493275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brv7r\" (UniqueName: \"kubernetes.io/projected/e299247b-558b-4b6c-9d7c-335475344fdc-kube-api-access-brv7r\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.493408 master-0 kubenswrapper[4790]: I1011 10:44:07.493300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-serving-cert\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.494485 master-0 kubenswrapper[4790]: I1011 10:44:07.494439 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-oauth-serving-cert\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.494785 master-0 kubenswrapper[4790]: I1011 10:44:07.494738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-service-ca\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.494838 master-0 kubenswrapper[4790]: I1011 10:44:07.494809 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-trusted-ca-bundle\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.495138 master-0 kubenswrapper[4790]: I1011 10:44:07.495085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-console-config\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.500693 master-0 kubenswrapper[4790]: I1011 10:44:07.500613 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-serving-cert\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:44:07.500904 master-0 kubenswrapper[4790]: I1011 10:44:07.500852 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-oauth-config\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq"
Oct 11 10:44:07.513160 master-0 kubenswrapper[4790]: I1011 10:44:07.513102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brv7r\" (UniqueName: \"kubernetes.io/projected/e299247b-558b-4b6c-9d7c-335475344fdc-kube-api-access-brv7r\") pod \"console-6f9d445f57-w4nwq\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " pod="openshift-console/console-6f9d445f57-w4nwq"
Oct 11 10:44:07.648417 master-0 kubenswrapper[4790]: I1011 10:44:07.647931 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f9d445f57-w4nwq"
Oct 11 10:44:08.104148 master-0 kubenswrapper[4790]: I1011 10:44:08.104078 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f9d445f57-w4nwq"]
Oct 11 10:44:08.106353 master-0 kubenswrapper[4790]: W1011 10:44:08.106309 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode299247b_558b_4b6c_9d7c_335475344fdc.slice/crio-c55a75264f621b83bca51a6b3abc1339de4396e41f1c001671f54d97f3acff21 WatchSource:0}: Error finding container c55a75264f621b83bca51a6b3abc1339de4396e41f1c001671f54d97f3acff21: Status 404 returned error can't find the container with id c55a75264f621b83bca51a6b3abc1339de4396e41f1c001671f54d97f3acff21
Oct 11 10:44:08.438419 master-0 kubenswrapper[4790]: I1011 10:44:08.438248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9d445f57-w4nwq" event={"ID":"e299247b-558b-4b6c-9d7c-335475344fdc","Type":"ContainerStarted","Data":"3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0"}
Oct 11 10:44:08.438419 master-0 kubenswrapper[4790]: I1011 10:44:08.438317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9d445f57-w4nwq" event={"ID":"e299247b-558b-4b6c-9d7c-335475344fdc","Type":"ContainerStarted","Data":"c55a75264f621b83bca51a6b3abc1339de4396e41f1c001671f54d97f3acff21"}
Oct 11 10:44:08.468820 master-0 kubenswrapper[4790]: I1011 10:44:08.468643 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f9d445f57-w4nwq" podStartSLOduration=1.4685880230000001 podStartE2EDuration="1.468588023s" podCreationTimestamp="2025-10-11 10:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:44:08.461915698 +0000 UTC m=+325.016376010" watchObservedRunningTime="2025-10-11 10:44:08.468588023 +0000 UTC m=+325.023048345"
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: I1011 10:44:10.050349 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]log ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]etcd excluded: ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]informer-sync ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld
Oct 11 10:44:10.050410 master-0 kubenswrapper[4790]: readyz check failed
Oct 11 10:44:10.051507 master-0 kubenswrapper[4790]: I1011 10:44:10.051469 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 10:44:13.691031 master-0 kubenswrapper[4790]: I1011 10:44:13.689805 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-8865994fd-4bs48"]
Oct 11 10:44:13.691996 master-0 kubenswrapper[4790]: I1011 10:44:13.691939 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.695860 master-0 kubenswrapper[4790]: I1011 10:44:13.695830 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 11 10:44:13.696057 master-0 kubenswrapper[4790]: I1011 10:44:13.695861 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 11 10:44:13.696057 master-0 kubenswrapper[4790]: I1011 10:44:13.696035 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 11 10:44:13.696196 master-0 kubenswrapper[4790]: I1011 10:44:13.696074 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 11 10:44:13.696196 master-0 kubenswrapper[4790]: I1011 10:44:13.696112 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 11 10:44:13.696196 master-0 kubenswrapper[4790]: I1011 10:44:13.695885 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 11 10:44:13.696196 master-0 kubenswrapper[4790]: I1011 10:44:13.695888 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-lntq9"
Oct 11 10:44:13.696389 master-0 kubenswrapper[4790]: I1011 10:44:13.696145 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 11 10:44:13.696447 master-0 kubenswrapper[4790]: I1011 10:44:13.696407 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 11 10:44:13.696447 master-0 kubenswrapper[4790]: I1011 10:44:13.696437 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 11 10:44:13.706164 master-0 kubenswrapper[4790]: I1011 10:44:13.706119 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8865994fd-4bs48"]
Oct 11 10:44:13.706925 master-0 kubenswrapper[4790]: I1011 10:44:13.706875 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 11 10:44:13.786022 master-0 kubenswrapper[4790]: I1011 10:44:13.785949 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-trusted-ca-bundle\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.786402 master-0 kubenswrapper[4790]: I1011 10:44:13.786376 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-image-import-ca\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.786529 master-0 kubenswrapper[4790]: I1011 10:44:13.786504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-audit\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.786645 master-0 kubenswrapper[4790]: I1011 10:44:13.786625 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-config\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.786836 master-0 kubenswrapper[4790]: I1011 10:44:13.786800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-encryption-config\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.787002 master-0 kubenswrapper[4790]: I1011 10:44:13.786979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-etcd-client\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.787306 master-0 kubenswrapper[4790]: I1011 10:44:13.787287 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzxkg\" (UniqueName: \"kubernetes.io/projected/cdf415dc-3a2a-4f52-90ae-81a963771876-kube-api-access-lzxkg\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.787436 master-0 kubenswrapper[4790]: I1011 10:44:13.787420 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdf415dc-3a2a-4f52-90ae-81a963771876-audit-dir\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.787584 master-0 kubenswrapper[4790]: I1011 10:44:13.787560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cdf415dc-3a2a-4f52-90ae-81a963771876-node-pullsecrets\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.787752 master-0 kubenswrapper[4790]: I1011 10:44:13.787729 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-etcd-serving-ca\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.787920 master-0 kubenswrapper[4790]: I1011 10:44:13.787892 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-serving-cert\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.889801 master-0 kubenswrapper[4790]: I1011 10:44:13.889663 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-serving-cert\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.889801 master-0 kubenswrapper[4790]: I1011 10:44:13.889814 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-trusted-ca-bundle\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.889849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-image-import-ca\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.889895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-audit\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.889931 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-config\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.889959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-encryption-config\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.889991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-etcd-client\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.890022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzxkg\" (UniqueName: \"kubernetes.io/projected/cdf415dc-3a2a-4f52-90ae-81a963771876-kube-api-access-lzxkg\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.890270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdf415dc-3a2a-4f52-90ae-81a963771876-audit-dir\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.890308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cdf415dc-3a2a-4f52-90ae-81a963771876-node-pullsecrets\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.890384 master-0 kubenswrapper[4790]: I1011 10:44:13.890345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-etcd-serving-ca\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.891250 master-0 kubenswrapper[4790]: I1011 10:44:13.891211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdf415dc-3a2a-4f52-90ae-81a963771876-audit-dir\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.891465 master-0 kubenswrapper[4790]: I1011 10:44:13.891427 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-etcd-serving-ca\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.891649 master-0 kubenswrapper[4790]: I1011 10:44:13.891565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cdf415dc-3a2a-4f52-90ae-81a963771876-node-pullsecrets\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.892589 master-0 kubenswrapper[4790]: I1011 10:44:13.892539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-config\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.892930 master-0 kubenswrapper[4790]: I1011 10:44:13.892860 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-image-import-ca\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.893219 master-0 kubenswrapper[4790]: I1011 10:44:13.893157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-audit\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.894152 master-0 kubenswrapper[4790]: I1011 10:44:13.894082 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf415dc-3a2a-4f52-90ae-81a963771876-trusted-ca-bundle\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.896135 master-0 kubenswrapper[4790]: I1011 10:44:13.896088 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-encryption-config\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.897518 master-0 kubenswrapper[4790]: I1011 10:44:13.897494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-serving-cert\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.897739 master-0 kubenswrapper[4790]: I1011 10:44:13.897649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cdf415dc-3a2a-4f52-90ae-81a963771876-etcd-client\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:13.914266 master-0 kubenswrapper[4790]: I1011 10:44:13.914197 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzxkg\" (UniqueName: \"kubernetes.io/projected/cdf415dc-3a2a-4f52-90ae-81a963771876-kube-api-access-lzxkg\") pod \"apiserver-8865994fd-4bs48\" (UID: \"cdf415dc-3a2a-4f52-90ae-81a963771876\") " pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:14.017158 master-0 kubenswrapper[4790]: I1011 10:44:14.016941 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:14.454616 master-0 kubenswrapper[4790]: I1011 10:44:14.454247 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8865994fd-4bs48"]
Oct 11 10:44:14.461493 master-0 kubenswrapper[4790]: W1011 10:44:14.461439 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf415dc_3a2a_4f52_90ae_81a963771876.slice/crio-2e0145fee607e1061480a46f1974f026117c6f9a12920376f8680f7f88b90732 WatchSource:0}: Error finding container 2e0145fee607e1061480a46f1974f026117c6f9a12920376f8680f7f88b90732: Status 404 returned error can't find the container with id 2e0145fee607e1061480a46f1974f026117c6f9a12920376f8680f7f88b90732
Oct 11 10:44:14.469359 master-0 kubenswrapper[4790]: I1011 10:44:14.469291 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8865994fd-4bs48" event={"ID":"cdf415dc-3a2a-4f52-90ae-81a963771876","Type":"ContainerStarted","Data":"2e0145fee607e1061480a46f1974f026117c6f9a12920376f8680f7f88b90732"}
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: I1011 10:44:15.049661 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]log ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]etcd excluded: ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]informer-sync ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld
Oct 11 10:44:15.049809 master-0 kubenswrapper[4790]: readyz check failed
Oct 11 10:44:15.052131 master-0 kubenswrapper[4790]: I1011 10:44:15.049827 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 10:44:15.052131 master-0 kubenswrapper[4790]: I1011 10:44:15.050119 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"
Oct 11 10:44:15.480309 master-0 kubenswrapper[4790]: I1011 10:44:15.480143 4790 generic.go:334] "Generic (PLEG): container finished" podID="cdf415dc-3a2a-4f52-90ae-81a963771876" containerID="44a2752364aefd0e8ef37a371fc1a02b554d307c7f399e48a010a918f35a11b1" exitCode=0
Oct 11 10:44:15.480309 master-0 kubenswrapper[4790]: I1011 10:44:15.480209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8865994fd-4bs48" event={"ID":"cdf415dc-3a2a-4f52-90ae-81a963771876","Type":"ContainerDied","Data":"44a2752364aefd0e8ef37a371fc1a02b554d307c7f399e48a010a918f35a11b1"}
Oct 11 10:44:16.489040 master-0 kubenswrapper[4790]: I1011 10:44:16.488928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8865994fd-4bs48" event={"ID":"cdf415dc-3a2a-4f52-90ae-81a963771876","Type":"ContainerStarted","Data":"6f0df9b47b168a17643529973b548904eef2d99ee4ab5088c9894dedd37f5d9b"}
Oct 11 10:44:16.489040 master-0 kubenswrapper[4790]: I1011 10:44:16.488989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8865994fd-4bs48" event={"ID":"cdf415dc-3a2a-4f52-90ae-81a963771876","Type":"ContainerStarted","Data":"ec910e69b540b523a147a7d7f29286f549aed08e47b9bfba169c2a9f41b1dc75"}
Oct 11 10:44:16.525768 master-0 kubenswrapper[4790]: I1011 10:44:16.525564 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-8865994fd-4bs48" podStartSLOduration=123.525525845 podStartE2EDuration="2m3.525525845s" podCreationTimestamp="2025-10-11 10:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:44:16.520451716 +0000 UTC m=+333.074912028" watchObservedRunningTime="2025-10-11 10:44:16.525525845 +0000 UTC m=+333.079986177"
Oct 11 10:44:17.648370 master-0 kubenswrapper[4790]: I1011 10:44:17.648284 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f9d445f57-w4nwq"
Oct 11 10:44:17.648370 master-0 kubenswrapper[4790]: I1011 10:44:17.648367 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f9d445f57-w4nwq"
Oct 11 10:44:17.650260 master-0 kubenswrapper[4790]: I1011 10:44:17.650215 4790 patch_prober.go:28] interesting pod/console-6f9d445f57-w4nwq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.130.0.27:8443/health\": dial tcp 10.130.0.27:8443: connect: connection refused" start-of-body=
Oct 11 10:44:17.650307 master-0 kubenswrapper[4790]: I1011 10:44:17.650276 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6f9d445f57-w4nwq" podUID="e299247b-558b-4b6c-9d7c-335475344fdc" containerName="console" probeResult="failure" output="Get \"https://10.130.0.27:8443/health\": dial tcp 10.130.0.27:8443: connect: connection refused"
Oct 11 10:44:19.017742 master-0 kubenswrapper[4790]: I1011 10:44:19.017600 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:19.017742 master-0 kubenswrapper[4790]: I1011 10:44:19.017689 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:19.024759 master-0 kubenswrapper[4790]: I1011 10:44:19.024678 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:19.514471 master-0 kubenswrapper[4790]: I1011 10:44:19.514408 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-8865994fd-4bs48"
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: I1011 10:44:20.052686 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]log ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]etcd excluded: ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]informer-sync ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld
Oct 11 10:44:20.052830 master-0 kubenswrapper[4790]: readyz check failed
Oct 11 10:44:20.053980 master-0 kubenswrapper[4790]: I1011 10:44:20.052845 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: I1011 10:44:25.050499 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]log ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]etcd excluded: ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]informer-sync ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld
Oct 11 10:44:25.050581 master-0 kubenswrapper[4790]: readyz check failed
Oct 11 10:44:25.052304 master-0 kubenswrapper[4790]: I1011 10:44:25.050598 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 10:44:27.648936 master-0 kubenswrapper[4790]: I1011 10:44:27.648824 4790 patch_prober.go:28] interesting pod/console-6f9d445f57-w4nwq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.130.0.27:8443/health\": dial tcp 10.130.0.27:8443: connect: connection refused" start-of-body=
Oct 11 10:44:27.648936 master-0 kubenswrapper[4790]: I1011 10:44:27.648917 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6f9d445f57-w4nwq" podUID="e299247b-558b-4b6c-9d7c-335475344fdc" containerName="console" probeResult="failure" output="Get \"https://10.130.0.27:8443/health\": dial tcp 10.130.0.27:8443: connect: connection refused"
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: I1011 10:44:30.050517 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]log ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]etcd excluded: ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]informer-sync ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld
Oct 11 10:44:30.050608 master-0 kubenswrapper[4790]: readyz check failed
Oct 11 10:44:30.051691 master-0 kubenswrapper[4790]: I1011 10:44:30.050624 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: I1011 10:44:35.048939 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]log ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]etcd excluded: ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]informer-sync ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld
Oct 11 10:44:35.049030 master-0 kubenswrapper[4790]: readyz check failed
Oct 11 10:44:35.050298 master-0 kubenswrapper[4790]: I1011 10:44:35.049075 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 11 10:44:37.655544 master-0 kubenswrapper[4790]: I1011 10:44:37.655403 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f9d445f57-w4nwq"
Oct 11 10:44:37.660280 master-0 kubenswrapper[4790]: I1011 10:44:37.660208 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f9d445f57-w4nwq"
Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: I1011 10:44:40.049160 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]log ok
Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]etcd excluded: ok
Oct 11 10:44:40.049249 master-0
kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:44:40.049249 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:44:40.050161 master-0 kubenswrapper[4790]: I1011 10:44:40.049278 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: I1011 10:44:45.051755 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: 
[+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:44:45.051839 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:44:45.052854 master-0 kubenswrapper[4790]: I1011 10:44:45.051861 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: I1011 10:44:50.050826 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]log ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]etcd excluded: ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]etcd-readiness excluded: ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]informer-sync ok Oct 11 10:44:50.050927 
master-0 kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: [-]shutdown failed: reason withheld Oct 11 10:44:50.050927 master-0 kubenswrapper[4790]: readyz check failed Oct 11 10:44:50.052206 master-0 kubenswrapper[4790]: I1011 10:44:50.050941 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 10:44:55.044772 master-0 kubenswrapper[4790]: I1011 10:44:55.044649 4790 patch_prober.go:28] interesting pod/apiserver-656768b4df-9c8k6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.130.0.12:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.12:8443: connect: connection refused" start-of-body= Oct 11 10:44:55.045500 master-0 kubenswrapper[4790]: I1011 10:44:55.044846 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.130.0.12:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.130.0.12:8443: connect: connection refused" Oct 11 10:44:56.354881 master-0 kubenswrapper[4790]: I1011 
10:44:56.354803 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:44:56.498396 master-0 kubenswrapper[4790]: I1011 10:44:56.498279 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-trusted-ca-bundle\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.498396 master-0 kubenswrapper[4790]: I1011 10:44:56.498398 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-policies\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.498795 master-0 kubenswrapper[4790]: I1011 10:44:56.498454 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-dir\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.498795 master-0 kubenswrapper[4790]: I1011 10:44:56.498522 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-client\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.498795 master-0 kubenswrapper[4790]: I1011 10:44:56.498620 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-serving-ca\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.498795 
master-0 kubenswrapper[4790]: I1011 10:44:56.498616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:44:56.498795 master-0 kubenswrapper[4790]: I1011 10:44:56.498686 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-encryption-config\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.498795 master-0 kubenswrapper[4790]: I1011 10:44:56.498792 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-serving-cert\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.498969 master-0 kubenswrapper[4790]: I1011 10:44:56.498933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2kt7\" (UniqueName: \"kubernetes.io/projected/76dd8647-4ad5-4874-b2d2-dee16aab637a-kube-api-access-k2kt7\") pod \"76dd8647-4ad5-4874-b2d2-dee16aab637a\" (UID: \"76dd8647-4ad5-4874-b2d2-dee16aab637a\") " Oct 11 10:44:56.499120 master-0 kubenswrapper[4790]: I1011 10:44:56.499084 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:56.499391 master-0 kubenswrapper[4790]: I1011 10:44:56.499336 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.499451 master-0 kubenswrapper[4790]: I1011 10:44:56.499394 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.499834 master-0 kubenswrapper[4790]: I1011 10:44:56.499755 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:56.499881 master-0 kubenswrapper[4790]: I1011 10:44:56.499791 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:44:56.502481 master-0 kubenswrapper[4790]: I1011 10:44:56.502438 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:44:56.503109 master-0 kubenswrapper[4790]: I1011 10:44:56.503001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:44:56.503418 master-0 kubenswrapper[4790]: I1011 10:44:56.503278 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:44:56.503547 master-0 kubenswrapper[4790]: I1011 10:44:56.503497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76dd8647-4ad5-4874-b2d2-dee16aab637a-kube-api-access-k2kt7" (OuterVolumeSpecName: "kube-api-access-k2kt7") pod "76dd8647-4ad5-4874-b2d2-dee16aab637a" (UID: "76dd8647-4ad5-4874-b2d2-dee16aab637a"). InnerVolumeSpecName "kube-api-access-k2kt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:44:56.600210 master-0 kubenswrapper[4790]: I1011 10:44:56.600003 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.600210 master-0 kubenswrapper[4790]: I1011 10:44:56.600049 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-encryption-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.600210 master-0 kubenswrapper[4790]: I1011 10:44:56.600061 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-serving-cert\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.600210 master-0 kubenswrapper[4790]: I1011 10:44:56.600071 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2kt7\" (UniqueName: \"kubernetes.io/projected/76dd8647-4ad5-4874-b2d2-dee16aab637a-kube-api-access-k2kt7\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.600210 master-0 kubenswrapper[4790]: I1011 10:44:56.600085 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76dd8647-4ad5-4874-b2d2-dee16aab637a-audit-policies\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.600210 master-0 kubenswrapper[4790]: I1011 10:44:56.600093 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76dd8647-4ad5-4874-b2d2-dee16aab637a-etcd-client\") on node \"master-0\" DevicePath \"\"" Oct 11 10:44:56.717503 master-0 kubenswrapper[4790]: I1011 10:44:56.717375 4790 generic.go:334] "Generic (PLEG): container finished" podID="76dd8647-4ad5-4874-b2d2-dee16aab637a" 
containerID="35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c" exitCode=0 Oct 11 10:44:56.717503 master-0 kubenswrapper[4790]: I1011 10:44:56.717456 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" event={"ID":"76dd8647-4ad5-4874-b2d2-dee16aab637a","Type":"ContainerDied","Data":"35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c"} Oct 11 10:44:56.717503 master-0 kubenswrapper[4790]: I1011 10:44:56.717509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" event={"ID":"76dd8647-4ad5-4874-b2d2-dee16aab637a","Type":"ContainerDied","Data":"64ce93912fbe2ce263f72579fc62109333989150c0bd59c119eb0bd06f24caa2"} Oct 11 10:44:56.717857 master-0 kubenswrapper[4790]: I1011 10:44:56.717508 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-656768b4df-9c8k6" Oct 11 10:44:56.717857 master-0 kubenswrapper[4790]: I1011 10:44:56.717542 4790 scope.go:117] "RemoveContainer" containerID="35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c" Oct 11 10:44:56.731772 master-0 kubenswrapper[4790]: I1011 10:44:56.731675 4790 scope.go:117] "RemoveContainer" containerID="ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331" Oct 11 10:44:56.749364 master-0 kubenswrapper[4790]: I1011 10:44:56.749288 4790 scope.go:117] "RemoveContainer" containerID="35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c" Oct 11 10:44:56.750219 master-0 kubenswrapper[4790]: E1011 10:44:56.750121 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c\": container with ID starting with 35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c not found: ID does not exist" 
containerID="35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c" Oct 11 10:44:56.750360 master-0 kubenswrapper[4790]: I1011 10:44:56.750227 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c"} err="failed to get container status \"35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c\": rpc error: code = NotFound desc = could not find container \"35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c\": container with ID starting with 35472050aef2a23d480cd0b8022e9ecdb2080124eec5c8fed4e4fce84c05d72c not found: ID does not exist" Oct 11 10:44:56.750360 master-0 kubenswrapper[4790]: I1011 10:44:56.750304 4790 scope.go:117] "RemoveContainer" containerID="ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331" Oct 11 10:44:56.751049 master-0 kubenswrapper[4790]: E1011 10:44:56.750946 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331\": container with ID starting with ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331 not found: ID does not exist" containerID="ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331" Oct 11 10:44:56.751183 master-0 kubenswrapper[4790]: I1011 10:44:56.751080 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331"} err="failed to get container status \"ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331\": rpc error: code = NotFound desc = could not find container \"ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331\": container with ID starting with ace4f650a5b9b61d268cf3bf1c5cdb946038997e7050cf67fa731d305fe20331 not found: ID does not exist" Oct 11 10:44:56.755184 master-0 
kubenswrapper[4790]: I1011 10:44:56.755060 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"] Oct 11 10:44:56.760595 master-0 kubenswrapper[4790]: I1011 10:44:56.760521 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-656768b4df-9c8k6"] Oct 11 10:44:58.303670 master-0 kubenswrapper[4790]: I1011 10:44:58.303572 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" path="/var/lib/kubelet/pods/76dd8647-4ad5-4874-b2d2-dee16aab637a/volumes" Oct 11 10:45:02.720654 master-0 kubenswrapper[4790]: I1011 10:45:02.720543 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r"] Oct 11 10:45:02.721830 master-0 kubenswrapper[4790]: E1011 10:45:02.720855 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" Oct 11 10:45:02.721830 master-0 kubenswrapper[4790]: I1011 10:45:02.720874 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" Oct 11 10:45:02.721830 master-0 kubenswrapper[4790]: E1011 10:45:02.720894 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="fix-audit-permissions" Oct 11 10:45:02.721830 master-0 kubenswrapper[4790]: I1011 10:45:02.720903 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="fix-audit-permissions" Oct 11 10:45:02.721830 master-0 kubenswrapper[4790]: I1011 10:45:02.720985 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dd8647-4ad5-4874-b2d2-dee16aab637a" containerName="oauth-apiserver" Oct 11 10:45:02.721830 master-0 kubenswrapper[4790]: I1011 10:45:02.721594 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.725090 master-0 kubenswrapper[4790]: I1011 10:45:02.725033 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 11 10:45:02.725090 master-0 kubenswrapper[4790]: I1011 10:45:02.725056 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 11 10:45:02.725380 master-0 kubenswrapper[4790]: I1011 10:45:02.725312 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 11 10:45:02.726027 master-0 kubenswrapper[4790]: I1011 10:45:02.725934 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 11 10:45:02.726094 master-0 kubenswrapper[4790]: I1011 10:45:02.726055 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 11 10:45:02.726094 master-0 kubenswrapper[4790]: I1011 10:45:02.725953 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 11 10:45:02.726183 master-0 kubenswrapper[4790]: I1011 10:45:02.726102 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 11 10:45:02.726228 master-0 kubenswrapper[4790]: I1011 10:45:02.726150 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-zlnjr" Oct 11 10:45:02.728830 master-0 kubenswrapper[4790]: I1011 10:45:02.728795 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 11 10:45:02.742879 master-0 kubenswrapper[4790]: I1011 10:45:02.742582 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r"] Oct 11 10:45:02.883566 master-0 kubenswrapper[4790]: I1011 10:45:02.883455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-etcd-serving-ca\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.883566 master-0 kubenswrapper[4790]: I1011 10:45:02.883538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-trusted-ca-bundle\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.883566 master-0 kubenswrapper[4790]: I1011 10:45:02.883567 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-etcd-client\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.883970 master-0 kubenswrapper[4790]: I1011 10:45:02.883590 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjvvj\" (UniqueName: \"kubernetes.io/projected/bc183705-096c-4af1-adf7-d3cd0e4532e1-kube-api-access-gjvvj\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.883970 master-0 kubenswrapper[4790]: I1011 10:45:02.883608 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-encryption-config\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.883970 master-0 kubenswrapper[4790]: I1011 10:45:02.883629 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc183705-096c-4af1-adf7-d3cd0e4532e1-audit-dir\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.883970 master-0 kubenswrapper[4790]: I1011 10:45:02.883645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-serving-cert\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.883970 master-0 kubenswrapper[4790]: I1011 10:45:02.883662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-audit-policies\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.985312 master-0 kubenswrapper[4790]: I1011 10:45:02.985156 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-encryption-config\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.985312 master-0 kubenswrapper[4790]: I1011 
10:45:02.985235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc183705-096c-4af1-adf7-d3cd0e4532e1-audit-dir\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.985312 master-0 kubenswrapper[4790]: I1011 10:45:02.985262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-serving-cert\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.985312 master-0 kubenswrapper[4790]: I1011 10:45:02.985285 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-audit-policies\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.985607 master-0 kubenswrapper[4790]: I1011 10:45:02.985356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-etcd-serving-ca\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.985607 master-0 kubenswrapper[4790]: I1011 10:45:02.985394 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-trusted-ca-bundle\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" 
Oct 11 10:45:02.985607 master-0 kubenswrapper[4790]: I1011 10:45:02.985421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-etcd-client\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.985607 master-0 kubenswrapper[4790]: I1011 10:45:02.985443 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjvvj\" (UniqueName: \"kubernetes.io/projected/bc183705-096c-4af1-adf7-d3cd0e4532e1-kube-api-access-gjvvj\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.986442 master-0 kubenswrapper[4790]: I1011 10:45:02.986378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc183705-096c-4af1-adf7-d3cd0e4532e1-audit-dir\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.986882 master-0 kubenswrapper[4790]: I1011 10:45:02.986841 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-trusted-ca-bundle\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.987154 master-0 kubenswrapper[4790]: I1011 10:45:02.987108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-etcd-serving-ca\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " 
pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.987154 master-0 kubenswrapper[4790]: I1011 10:45:02.987136 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bc183705-096c-4af1-adf7-d3cd0e4532e1-audit-policies\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.989243 master-0 kubenswrapper[4790]: I1011 10:45:02.989179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-etcd-client\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.989576 master-0 kubenswrapper[4790]: I1011 10:45:02.989533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-serving-cert\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:02.991360 master-0 kubenswrapper[4790]: I1011 10:45:02.991265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc183705-096c-4af1-adf7-d3cd0e4532e1-encryption-config\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: \"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:03.013411 master-0 kubenswrapper[4790]: I1011 10:45:03.013301 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjvvj\" (UniqueName: \"kubernetes.io/projected/bc183705-096c-4af1-adf7-d3cd0e4532e1-kube-api-access-gjvvj\") pod \"apiserver-68f4c55ff4-nk86r\" (UID: 
\"bc183705-096c-4af1-adf7-d3cd0e4532e1\") " pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:03.048177 master-0 kubenswrapper[4790]: I1011 10:45:03.048088 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:03.476071 master-0 kubenswrapper[4790]: I1011 10:45:03.475991 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r"] Oct 11 10:45:03.484763 master-0 kubenswrapper[4790]: W1011 10:45:03.484486 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc183705_096c_4af1_adf7_d3cd0e4532e1.slice/crio-cc5e24850d0f6f614e758fd3bf1c6e67c3053d70b28fafb79a2d04da058ebe3c WatchSource:0}: Error finding container cc5e24850d0f6f614e758fd3bf1c6e67c3053d70b28fafb79a2d04da058ebe3c: Status 404 returned error can't find the container with id cc5e24850d0f6f614e758fd3bf1c6e67c3053d70b28fafb79a2d04da058ebe3c Oct 11 10:45:03.760058 master-0 kubenswrapper[4790]: I1011 10:45:03.759969 4790 generic.go:334] "Generic (PLEG): container finished" podID="bc183705-096c-4af1-adf7-d3cd0e4532e1" containerID="be32288dd089ab960bc2afabfe55a2399594c886fbc3d68706d27c242828cb8b" exitCode=0 Oct 11 10:45:03.760799 master-0 kubenswrapper[4790]: I1011 10:45:03.760073 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" event={"ID":"bc183705-096c-4af1-adf7-d3cd0e4532e1","Type":"ContainerDied","Data":"be32288dd089ab960bc2afabfe55a2399594c886fbc3d68706d27c242828cb8b"} Oct 11 10:45:03.760799 master-0 kubenswrapper[4790]: I1011 10:45:03.760130 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" 
event={"ID":"bc183705-096c-4af1-adf7-d3cd0e4532e1","Type":"ContainerStarted","Data":"cc5e24850d0f6f614e758fd3bf1c6e67c3053d70b28fafb79a2d04da058ebe3c"} Oct 11 10:45:04.770305 master-0 kubenswrapper[4790]: I1011 10:45:04.770185 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" event={"ID":"bc183705-096c-4af1-adf7-d3cd0e4532e1","Type":"ContainerStarted","Data":"3bc29af6e88c7b8b34419367f59f128bbade88fe51d2455bcbbe1a01fe1d8528"} Oct 11 10:45:04.800496 master-0 kubenswrapper[4790]: I1011 10:45:04.800014 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" podStartSLOduration=61.799985373 podStartE2EDuration="1m1.799985373s" podCreationTimestamp="2025-10-11 10:44:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:45:04.798249127 +0000 UTC m=+381.352709499" watchObservedRunningTime="2025-10-11 10:45:04.799985373 +0000 UTC m=+381.354445675" Oct 11 10:45:08.048921 master-0 kubenswrapper[4790]: I1011 10:45:08.048806 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:08.048921 master-0 kubenswrapper[4790]: I1011 10:45:08.048928 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:08.062396 master-0 kubenswrapper[4790]: I1011 10:45:08.062325 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:45:08.815581 master-0 kubenswrapper[4790]: I1011 10:45:08.815189 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-68f4c55ff4-nk86r" Oct 11 10:46:34.076947 master-0 kubenswrapper[4790]: I1011 10:46:34.076885 4790 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/revision-pruner-10-master-0"] Oct 11 10:46:34.078099 master-0 kubenswrapper[4790]: I1011 10:46:34.078052 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.082804 master-0 kubenswrapper[4790]: I1011 10:46:34.082680 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-xbqxb" Oct 11 10:46:34.095111 master-0 kubenswrapper[4790]: I1011 10:46:34.095020 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/revision-pruner-10-master-0"] Oct 11 10:46:34.237201 master-0 kubenswrapper[4790]: I1011 10:46:34.237080 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kubelet-dir\") pod \"revision-pruner-10-master-0\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.237593 master-0 kubenswrapper[4790]: I1011 10:46:34.237271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kube-api-access\") pod \"revision-pruner-10-master-0\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.338510 master-0 kubenswrapper[4790]: I1011 10:46:34.338281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kube-api-access\") pod \"revision-pruner-10-master-0\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.338510 master-0 kubenswrapper[4790]: I1011 10:46:34.338413 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kubelet-dir\") pod \"revision-pruner-10-master-0\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.339039 master-0 kubenswrapper[4790]: I1011 10:46:34.338620 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kubelet-dir\") pod \"revision-pruner-10-master-0\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.360432 master-0 kubenswrapper[4790]: I1011 10:46:34.360348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kube-api-access\") pod \"revision-pruner-10-master-0\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.410526 master-0 kubenswrapper[4790]: I1011 10:46:34.410430 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:34.905336 master-0 kubenswrapper[4790]: I1011 10:46:34.905114 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/revision-pruner-10-master-0"] Oct 11 10:46:34.913989 master-0 kubenswrapper[4790]: W1011 10:46:34.913875 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode35e5ca9_d4d4_47f2_a2d0_217f9ac77ba3.slice/crio-c91bd4294e6142d5101911ef5ff984c42444eb6fa63a1f61f90d9fe739253af0 WatchSource:0}: Error finding container c91bd4294e6142d5101911ef5ff984c42444eb6fa63a1f61f90d9fe739253af0: Status 404 returned error can't find the container with id c91bd4294e6142d5101911ef5ff984c42444eb6fa63a1f61f90d9fe739253af0 Oct 11 10:46:35.292972 master-0 kubenswrapper[4790]: I1011 10:46:35.292878 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-0" event={"ID":"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3","Type":"ContainerStarted","Data":"c91bd4294e6142d5101911ef5ff984c42444eb6fa63a1f61f90d9fe739253af0"} Oct 11 10:46:36.312543 master-0 kubenswrapper[4790]: I1011 10:46:36.312432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-0" event={"ID":"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3","Type":"ContainerStarted","Data":"403e2dc1bf2d947f244343a321e98557f5e484a29eac2d4b8168b223f45ad3d6"} Oct 11 10:46:36.343417 master-0 kubenswrapper[4790]: I1011 10:46:36.343293 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/revision-pruner-10-master-0" podStartSLOduration=2.343263893 podStartE2EDuration="2.343263893s" podCreationTimestamp="2025-10-11 10:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:46:36.341880745 +0000 UTC m=+472.896341047" watchObservedRunningTime="2025-10-11 10:46:36.343263893 +0000 UTC 
m=+472.897724225" Oct 11 10:46:37.323050 master-0 kubenswrapper[4790]: I1011 10:46:37.322958 4790 generic.go:334] "Generic (PLEG): container finished" podID="e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3" containerID="403e2dc1bf2d947f244343a321e98557f5e484a29eac2d4b8168b223f45ad3d6" exitCode=0 Oct 11 10:46:37.323050 master-0 kubenswrapper[4790]: I1011 10:46:37.323022 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-0" event={"ID":"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3","Type":"ContainerDied","Data":"403e2dc1bf2d947f244343a321e98557f5e484a29eac2d4b8168b223f45ad3d6"} Oct 11 10:46:38.729599 master-0 kubenswrapper[4790]: I1011 10:46:38.729511 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:38.895748 master-0 kubenswrapper[4790]: I1011 10:46:38.895646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kubelet-dir\") pod \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " Oct 11 10:46:38.896013 master-0 kubenswrapper[4790]: I1011 10:46:38.895766 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3" (UID: "e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:46:38.896013 master-0 kubenswrapper[4790]: I1011 10:46:38.895922 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kube-api-access\") pod \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\" (UID: \"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3\") " Oct 11 10:46:38.896821 master-0 kubenswrapper[4790]: I1011 10:46:38.896778 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:46:38.899917 master-0 kubenswrapper[4790]: I1011 10:46:38.899878 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3" (UID: "e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:46:38.997926 master-0 kubenswrapper[4790]: I1011 10:46:38.997798 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:46:39.339838 master-0 kubenswrapper[4790]: I1011 10:46:39.339755 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-0" event={"ID":"e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3","Type":"ContainerDied","Data":"c91bd4294e6142d5101911ef5ff984c42444eb6fa63a1f61f90d9fe739253af0"} Oct 11 10:46:39.339838 master-0 kubenswrapper[4790]: I1011 10:46:39.339825 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91bd4294e6142d5101911ef5ff984c42444eb6fa63a1f61f90d9fe739253af0" Oct 11 10:46:39.340320 master-0 kubenswrapper[4790]: I1011 10:46:39.339959 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-0" Oct 11 10:46:43.472395 master-0 kubenswrapper[4790]: I1011 10:46:43.472278 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-10-master-0"] Oct 11 10:46:43.473410 master-0 kubenswrapper[4790]: E1011 10:46:43.472782 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3" containerName="pruner" Oct 11 10:46:43.473410 master-0 kubenswrapper[4790]: I1011 10:46:43.472810 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3" containerName="pruner" Oct 11 10:46:43.473410 master-0 kubenswrapper[4790]: I1011 10:46:43.472977 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3" containerName="pruner" Oct 11 10:46:43.473887 master-0 kubenswrapper[4790]: I1011 10:46:43.473839 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.476992 master-0 kubenswrapper[4790]: I1011 10:46:43.476929 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-xbqxb" Oct 11 10:46:43.489065 master-0 kubenswrapper[4790]: I1011 10:46:43.488974 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-10-master-0"] Oct 11 10:46:43.658902 master-0 kubenswrapper[4790]: I1011 10:46:43.658800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-kubelet-dir\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.658902 master-0 kubenswrapper[4790]: I1011 10:46:43.658906 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/527d9cd7-412d-4afb-9212-c8697426a964-kube-api-access\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.659257 master-0 kubenswrapper[4790]: I1011 10:46:43.659006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-var-lock\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.760588 master-0 kubenswrapper[4790]: I1011 10:46:43.760415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-var-lock\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.760588 master-0 kubenswrapper[4790]: I1011 10:46:43.760514 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-kubelet-dir\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.760588 master-0 kubenswrapper[4790]: I1011 10:46:43.760553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/527d9cd7-412d-4afb-9212-c8697426a964-kube-api-access\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.760918 master-0 kubenswrapper[4790]: I1011 10:46:43.760619 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-var-lock\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.760918 master-0 kubenswrapper[4790]: I1011 10:46:43.760730 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-kubelet-dir\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.783441 master-0 kubenswrapper[4790]: I1011 10:46:43.783386 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/527d9cd7-412d-4afb-9212-c8697426a964-kube-api-access\") pod \"installer-10-master-0\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:43.861698 master-0 kubenswrapper[4790]: I1011 10:46:43.861612 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-10-master-0" Oct 11 10:46:44.305232 master-0 kubenswrapper[4790]: I1011 10:46:44.305152 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-10-master-0"] Oct 11 10:46:44.369618 master-0 kubenswrapper[4790]: I1011 10:46:44.369540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-0" event={"ID":"527d9cd7-412d-4afb-9212-c8697426a964","Type":"ContainerStarted","Data":"0718367c3bdc698f2094bf8918ac19ddd76bb29db33497d019a8831490485d51"} Oct 11 10:46:45.377790 master-0 kubenswrapper[4790]: I1011 10:46:45.377649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-0" event={"ID":"527d9cd7-412d-4afb-9212-c8697426a964","Type":"ContainerStarted","Data":"84981853b8264575ef9774e4c0deccd1808b713d6d64a0dcb63fc54fcf80f561"} Oct 11 10:46:45.406833 master-0 kubenswrapper[4790]: I1011 10:46:45.406749 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-10-master-0" podStartSLOduration=2.406722992 podStartE2EDuration="2.406722992s" podCreationTimestamp="2025-10-11 10:46:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:46:45.405862938 +0000 UTC m=+481.960323260" watchObservedRunningTime="2025-10-11 10:46:45.406722992 +0000 UTC m=+481.961183284" Oct 11 10:47:15.831447 master-0 kubenswrapper[4790]: I1011 10:47:15.831260 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"] Oct 11 10:47:15.832241 master-0 kubenswrapper[4790]: I1011 10:47:15.832009 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcdctl" containerID="cri-o://d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970" gracePeriod=30 Oct 11 
10:47:15.832241 master-0 kubenswrapper[4790]: I1011 10:47:15.832081 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-rev" containerID="cri-o://3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060" gracePeriod=30 Oct 11 10:47:15.832241 master-0 kubenswrapper[4790]: I1011 10:47:15.832160 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" containerID="cri-o://b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca" gracePeriod=30 Oct 11 10:47:15.832241 master-0 kubenswrapper[4790]: I1011 10:47:15.832140 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-metrics" containerID="cri-o://95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377" gracePeriod=30 Oct 11 10:47:15.832375 master-0 kubenswrapper[4790]: I1011 10:47:15.832087 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-readyz" containerID="cri-o://e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef" gracePeriod=30 Oct 11 10:47:15.835076 master-0 kubenswrapper[4790]: I1011 10:47:15.835030 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Oct 11 10:47:15.835311 master-0 kubenswrapper[4790]: E1011 10:47:15.835278 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.835350 master-0 kubenswrapper[4790]: I1011 10:47:15.835309 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.835350 master-0 
kubenswrapper[4790]: E1011 10:47:15.835330 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.835350 master-0 kubenswrapper[4790]: I1011 10:47:15.835347 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: E1011 10:47:15.835362 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-rev" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: I1011 10:47:15.835377 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-rev" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: E1011 10:47:15.835404 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="setup" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: I1011 10:47:15.835417 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="setup" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: E1011 10:47:15.835434 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcdctl" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: I1011 10:47:15.835447 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcdctl" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: E1011 10:47:15.835462 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-readyz" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: I1011 10:47:15.835475 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-readyz" Oct 11 10:47:15.835499 master-0 
kubenswrapper[4790]: E1011 10:47:15.835488 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-metrics" Oct 11 10:47:15.835499 master-0 kubenswrapper[4790]: I1011 10:47:15.835501 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-metrics" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: E1011 10:47:15.835518 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-ensure-env-vars" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835531 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-ensure-env-vars" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: E1011 10:47:15.835548 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-resources-copy" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835561 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-resources-copy" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835675 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-metrics" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835699 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-readyz" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835743 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcdctl" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835755 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835771 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd-rev" Oct 11 10:47:15.835788 master-0 kubenswrapper[4790]: I1011 10:47:15.835786 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.836058 master-0 kubenswrapper[4790]: I1011 10:47:15.835804 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.836058 master-0 kubenswrapper[4790]: E1011 10:47:15.835922 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:15.836058 master-0 kubenswrapper[4790]: I1011 10:47:15.835936 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e53a8977ce5fc5588aef94f91dcc24" containerName="etcd" Oct 11 10:47:16.010035 master-0 kubenswrapper[4790]: I1011 10:47:16.009919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-cert-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.010035 master-0 kubenswrapper[4790]: I1011 10:47:16.009995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-static-pod-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.010035 master-0 kubenswrapper[4790]: I1011 10:47:16.010028 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-data-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.010035 master-0 kubenswrapper[4790]: I1011 10:47:16.010051 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-resource-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.010467 master-0 kubenswrapper[4790]: I1011 10:47:16.010072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-log-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.010467 master-0 kubenswrapper[4790]: I1011 10:47:16.010103 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-usr-local-bin\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.111930 master-0 kubenswrapper[4790]: I1011 10:47:16.111740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-log-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.111930 master-0 kubenswrapper[4790]: I1011 10:47:16.111802 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-usr-local-bin\") pod \"etcd-master-0\" (UID: 
\"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.111930 master-0 kubenswrapper[4790]: I1011 10:47:16.111842 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-cert-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.111930 master-0 kubenswrapper[4790]: I1011 10:47:16.111866 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-static-pod-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.111930 master-0 kubenswrapper[4790]: I1011 10:47:16.111895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-data-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.111930 master-0 kubenswrapper[4790]: I1011 10:47:16.111918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-resource-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.112333 master-0 kubenswrapper[4790]: I1011 10:47:16.111980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-log-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.112333 master-0 kubenswrapper[4790]: I1011 10:47:16.112030 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-data-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.112333 master-0 kubenswrapper[4790]: I1011 10:47:16.112004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-resource-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.112333 master-0 kubenswrapper[4790]: I1011 10:47:16.112037 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-cert-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.112333 master-0 kubenswrapper[4790]: I1011 10:47:16.112080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-static-pod-dir\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.112333 master-0 kubenswrapper[4790]: I1011 10:47:16.112180 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/14286286be88b59efc7cfc15eca1cc38-usr-local-bin\") pod \"etcd-master-0\" (UID: \"14286286be88b59efc7cfc15eca1cc38\") " pod="openshift-etcd/etcd-master-0" Oct 11 10:47:16.559518 master-0 kubenswrapper[4790]: I1011 10:47:16.559458 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/1.log" Oct 11 10:47:16.560176 master-0 kubenswrapper[4790]: I1011 10:47:16.560140 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd-rev/0.log" Oct 11 10:47:16.561216 master-0 kubenswrapper[4790]: I1011 10:47:16.561190 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd-metrics/0.log" Oct 11 10:47:16.562909 master-0 kubenswrapper[4790]: I1011 10:47:16.562871 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060" exitCode=2 Oct 11 10:47:16.562909 master-0 kubenswrapper[4790]: I1011 10:47:16.562905 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef" exitCode=0 Oct 11 10:47:16.563046 master-0 kubenswrapper[4790]: I1011 10:47:16.562917 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377" exitCode=2 Oct 11 10:47:16.565090 master-0 kubenswrapper[4790]: I1011 10:47:16.565059 4790 generic.go:334] "Generic (PLEG): container finished" podID="527d9cd7-412d-4afb-9212-c8697426a964" containerID="84981853b8264575ef9774e4c0deccd1808b713d6d64a0dcb63fc54fcf80f561" exitCode=0 Oct 11 10:47:16.565171 master-0 kubenswrapper[4790]: I1011 10:47:16.565098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-0" event={"ID":"527d9cd7-412d-4afb-9212-c8697426a964","Type":"ContainerDied","Data":"84981853b8264575ef9774e4c0deccd1808b713d6d64a0dcb63fc54fcf80f561"} Oct 11 10:47:16.833167 master-0 kubenswrapper[4790]: I1011 10:47:16.832992 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: 
connection refused" start-of-body= Oct 11 10:47:16.833167 master-0 kubenswrapper[4790]: I1011 10:47:16.833096 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:17.861813 master-0 kubenswrapper[4790]: I1011 10:47:17.861743 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-10-master-0" Oct 11 10:47:18.034296 master-0 kubenswrapper[4790]: I1011 10:47:18.034233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-kubelet-dir\") pod \"527d9cd7-412d-4afb-9212-c8697426a964\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " Oct 11 10:47:18.034296 master-0 kubenswrapper[4790]: I1011 10:47:18.034297 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-var-lock\") pod \"527d9cd7-412d-4afb-9212-c8697426a964\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " Oct 11 10:47:18.034634 master-0 kubenswrapper[4790]: I1011 10:47:18.034346 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/527d9cd7-412d-4afb-9212-c8697426a964-kube-api-access\") pod \"527d9cd7-412d-4afb-9212-c8697426a964\" (UID: \"527d9cd7-412d-4afb-9212-c8697426a964\") " Oct 11 10:47:18.034634 master-0 kubenswrapper[4790]: I1011 10:47:18.034403 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "527d9cd7-412d-4afb-9212-c8697426a964" 
(UID: "527d9cd7-412d-4afb-9212-c8697426a964"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:18.034634 master-0 kubenswrapper[4790]: I1011 10:47:18.034485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-var-lock" (OuterVolumeSpecName: "var-lock") pod "527d9cd7-412d-4afb-9212-c8697426a964" (UID: "527d9cd7-412d-4afb-9212-c8697426a964"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:18.034634 master-0 kubenswrapper[4790]: I1011 10:47:18.034594 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:18.034634 master-0 kubenswrapper[4790]: I1011 10:47:18.034614 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/527d9cd7-412d-4afb-9212-c8697426a964-var-lock\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:18.038019 master-0 kubenswrapper[4790]: I1011 10:47:18.037961 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527d9cd7-412d-4afb-9212-c8697426a964-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "527d9cd7-412d-4afb-9212-c8697426a964" (UID: "527d9cd7-412d-4afb-9212-c8697426a964"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:47:18.135906 master-0 kubenswrapper[4790]: I1011 10:47:18.135743 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/527d9cd7-412d-4afb-9212-c8697426a964-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:18.578020 master-0 kubenswrapper[4790]: I1011 10:47:18.577908 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-0" event={"ID":"527d9cd7-412d-4afb-9212-c8697426a964","Type":"ContainerDied","Data":"0718367c3bdc698f2094bf8918ac19ddd76bb29db33497d019a8831490485d51"} Oct 11 10:47:18.578020 master-0 kubenswrapper[4790]: I1011 10:47:18.578018 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0718367c3bdc698f2094bf8918ac19ddd76bb29db33497d019a8831490485d51" Oct 11 10:47:18.578825 master-0 kubenswrapper[4790]: I1011 10:47:18.578774 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-10-master-0" Oct 11 10:47:21.243258 master-0 kubenswrapper[4790]: I1011 10:47:21.243181 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-7f4f89bcdb-rh9fx"] Oct 11 10:47:21.244130 master-0 kubenswrapper[4790]: E1011 10:47:21.243390 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527d9cd7-412d-4afb-9212-c8697426a964" containerName="installer" Oct 11 10:47:21.244130 master-0 kubenswrapper[4790]: I1011 10:47:21.243406 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="527d9cd7-412d-4afb-9212-c8697426a964" containerName="installer" Oct 11 10:47:21.244130 master-0 kubenswrapper[4790]: I1011 10:47:21.243535 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="527d9cd7-412d-4afb-9212-c8697426a964" containerName="installer" Oct 11 10:47:21.244130 master-0 kubenswrapper[4790]: I1011 10:47:21.244132 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.247090 master-0 kubenswrapper[4790]: I1011 10:47:21.247029 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Oct 11 10:47:21.247374 master-0 kubenswrapper[4790]: I1011 10:47:21.247346 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Oct 11 10:47:21.247815 master-0 kubenswrapper[4790]: I1011 10:47:21.247788 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Oct 11 10:47:21.248015 master-0 kubenswrapper[4790]: I1011 10:47:21.247988 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Oct 11 10:47:21.248147 master-0 kubenswrapper[4790]: I1011 10:47:21.248124 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-storage"/"lvms-operator-webhook-server-cert" Oct 11 10:47:21.265394 master-0 kubenswrapper[4790]: I1011 10:47:21.265329 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-7f4f89bcdb-rh9fx"] Oct 11 10:47:21.392856 master-0 kubenswrapper[4790]: I1011 10:47:21.392775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-webhook-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.393146 master-0 kubenswrapper[4790]: I1011 10:47:21.392926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/504ab58a-33b6-400f-8f3f-8ed6be984915-socket-dir\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.393146 master-0 kubenswrapper[4790]: I1011 10:47:21.392963 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-metrics-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.393263 master-0 kubenswrapper[4790]: I1011 10:47:21.393150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-apiservice-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.393263 master-0 
kubenswrapper[4790]: I1011 10:47:21.393186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zv8h\" (UniqueName: \"kubernetes.io/projected/504ab58a-33b6-400f-8f3f-8ed6be984915-kube-api-access-8zv8h\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.495252 master-0 kubenswrapper[4790]: I1011 10:47:21.495069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/504ab58a-33b6-400f-8f3f-8ed6be984915-socket-dir\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.495252 master-0 kubenswrapper[4790]: I1011 10:47:21.495162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-metrics-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.495252 master-0 kubenswrapper[4790]: I1011 10:47:21.495203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-apiservice-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.495252 master-0 kubenswrapper[4790]: I1011 10:47:21.495241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zv8h\" (UniqueName: \"kubernetes.io/projected/504ab58a-33b6-400f-8f3f-8ed6be984915-kube-api-access-8zv8h\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: 
\"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.495688 master-0 kubenswrapper[4790]: I1011 10:47:21.495322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-webhook-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.495930 master-0 kubenswrapper[4790]: I1011 10:47:21.495857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/504ab58a-33b6-400f-8f3f-8ed6be984915-socket-dir\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.499591 master-0 kubenswrapper[4790]: I1011 10:47:21.499533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-webhook-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.500453 master-0 kubenswrapper[4790]: I1011 10:47:21.500390 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-apiservice-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.501268 master-0 kubenswrapper[4790]: I1011 10:47:21.501223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/504ab58a-33b6-400f-8f3f-8ed6be984915-metrics-cert\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" 
(UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.525825 master-0 kubenswrapper[4790]: I1011 10:47:21.525766 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zv8h\" (UniqueName: \"kubernetes.io/projected/504ab58a-33b6-400f-8f3f-8ed6be984915-kube-api-access-8zv8h\") pod \"lvms-operator-7f4f89bcdb-rh9fx\" (UID: \"504ab58a-33b6-400f-8f3f-8ed6be984915\") " pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.559929 master-0 kubenswrapper[4790]: I1011 10:47:21.559859 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:21.833383 master-0 kubenswrapper[4790]: I1011 10:47:21.833318 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body= Oct 11 10:47:21.833383 master-0 kubenswrapper[4790]: I1011 10:47:21.833402 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:22.003471 master-0 kubenswrapper[4790]: I1011 10:47:22.003393 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-7f4f89bcdb-rh9fx"] Oct 11 10:47:22.013128 master-0 kubenswrapper[4790]: W1011 10:47:22.013066 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod504ab58a_33b6_400f_8f3f_8ed6be984915.slice/crio-a9aa1d1d269ae6bc7d8fbd857f1784dbc7b4e717ec2163fa271a0d7c15fae2dd WatchSource:0}: Error finding container 
a9aa1d1d269ae6bc7d8fbd857f1784dbc7b4e717ec2163fa271a0d7c15fae2dd: Status 404 returned error can't find the container with id a9aa1d1d269ae6bc7d8fbd857f1784dbc7b4e717ec2163fa271a0d7c15fae2dd Oct 11 10:47:22.018192 master-0 kubenswrapper[4790]: I1011 10:47:22.018152 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 10:47:22.604690 master-0 kubenswrapper[4790]: I1011 10:47:22.604599 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" event={"ID":"504ab58a-33b6-400f-8f3f-8ed6be984915","Type":"ContainerStarted","Data":"a9aa1d1d269ae6bc7d8fbd857f1784dbc7b4e717ec2163fa271a0d7c15fae2dd"} Oct 11 10:47:26.833098 master-0 kubenswrapper[4790]: I1011 10:47:26.833002 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body= Oct 11 10:47:26.833903 master-0 kubenswrapper[4790]: I1011 10:47:26.833125 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:26.833903 master-0 kubenswrapper[4790]: I1011 10:47:26.833234 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:47:26.834387 master-0 kubenswrapper[4790]: I1011 10:47:26.834309 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body= Oct 11 10:47:26.834496 master-0 kubenswrapper[4790]: I1011 
10:47:26.834436 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:27.632018 master-0 kubenswrapper[4790]: I1011 10:47:27.631767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" event={"ID":"504ab58a-33b6-400f-8f3f-8ed6be984915","Type":"ContainerStarted","Data":"a6701b274e8f5e138b58ffe2d1c3a1b4ca33b4650d0b25acda5098cb29b36a5b"} Oct 11 10:47:27.632018 master-0 kubenswrapper[4790]: I1011 10:47:27.632013 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:27.669532 master-0 kubenswrapper[4790]: I1011 10:47:27.669404 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" podStartSLOduration=1.3791847640000001 podStartE2EDuration="6.669365413s" podCreationTimestamp="2025-10-11 10:47:21 +0000 UTC" firstStartedPulling="2025-10-11 10:47:22.018033055 +0000 UTC m=+518.572493347" lastFinishedPulling="2025-10-11 10:47:27.308213664 +0000 UTC m=+523.862673996" observedRunningTime="2025-10-11 10:47:27.664780608 +0000 UTC m=+524.219240950" watchObservedRunningTime="2025-10-11 10:47:27.669365413 +0000 UTC m=+524.223825745" Oct 11 10:47:28.642276 master-0 kubenswrapper[4790]: I1011 10:47:28.642151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-7f4f89bcdb-rh9fx" Oct 11 10:47:31.832887 master-0 kubenswrapper[4790]: I1011 10:47:31.832760 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: 
connect: connection refused" start-of-body= Oct 11 10:47:31.833529 master-0 kubenswrapper[4790]: I1011 10:47:31.832912 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:36.833243 master-0 kubenswrapper[4790]: I1011 10:47:36.833117 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body= Oct 11 10:47:36.833243 master-0 kubenswrapper[4790]: I1011 10:47:36.833220 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:41.833728 master-0 kubenswrapper[4790]: I1011 10:47:41.833616 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body= Oct 11 10:47:41.834514 master-0 kubenswrapper[4790]: I1011 10:47:41.833787 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:46.405046 master-0 kubenswrapper[4790]: I1011 10:47:46.404983 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/2.log" Oct 11 10:47:46.405782 master-0 kubenswrapper[4790]: I1011 10:47:46.405412 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/1.log" Oct 11 10:47:46.406274 master-0 kubenswrapper[4790]: I1011 10:47:46.406240 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd-rev/0.log" Oct 11 10:47:46.407512 master-0 kubenswrapper[4790]: I1011 10:47:46.407473 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd-metrics/0.log" Oct 11 10:47:46.408047 master-0 kubenswrapper[4790]: I1011 10:47:46.408005 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcdctl/0.log" Oct 11 10:47:46.409535 master-0 kubenswrapper[4790]: I1011 10:47:46.409508 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Oct 11 10:47:46.415805 master-0 kubenswrapper[4790]: I1011 10:47:46.415681 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-0" oldPodUID="a7e53a8977ce5fc5588aef94f91dcc24" podUID="14286286be88b59efc7cfc15eca1cc38" Oct 11 10:47:46.538822 master-0 kubenswrapper[4790]: I1011 10:47:46.538734 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-usr-local-bin\") pod \"a7e53a8977ce5fc5588aef94f91dcc24\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.538859 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-cert-dir\") pod \"a7e53a8977ce5fc5588aef94f91dcc24\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.538881 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "a7e53a8977ce5fc5588aef94f91dcc24" (UID: "a7e53a8977ce5fc5588aef94f91dcc24"). InnerVolumeSpecName "usr-local-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.538941 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-resource-dir\") pod \"a7e53a8977ce5fc5588aef94f91dcc24\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.538982 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "a7e53a8977ce5fc5588aef94f91dcc24" (UID: "a7e53a8977ce5fc5588aef94f91dcc24"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.539015 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-data-dir\") pod \"a7e53a8977ce5fc5588aef94f91dcc24\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.539091 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-log-dir\") pod \"a7e53a8977ce5fc5588aef94f91dcc24\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.539129 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-data-dir" (OuterVolumeSpecName: "data-dir") pod "a7e53a8977ce5fc5588aef94f91dcc24" (UID: "a7e53a8977ce5fc5588aef94f91dcc24"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:46.539167 master-0 kubenswrapper[4790]: I1011 10:47:46.539176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-static-pod-dir\") pod \"a7e53a8977ce5fc5588aef94f91dcc24\" (UID: \"a7e53a8977ce5fc5588aef94f91dcc24\") " Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a7e53a8977ce5fc5588aef94f91dcc24" (UID: "a7e53a8977ce5fc5588aef94f91dcc24"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-log-dir" (OuterVolumeSpecName: "log-dir") pod "a7e53a8977ce5fc5588aef94f91dcc24" (UID: "a7e53a8977ce5fc5588aef94f91dcc24"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539308 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "a7e53a8977ce5fc5588aef94f91dcc24" (UID: "a7e53a8977ce5fc5588aef94f91dcc24"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539529 4790 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539561 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-cert-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539579 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-resource-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539596 4790 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-data-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539615 4790 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-log-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:46.539807 master-0 kubenswrapper[4790]: I1011 10:47:46.539631 4790 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/a7e53a8977ce5fc5588aef94f91dcc24-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:47:46.741687 master-0 kubenswrapper[4790]: I1011 10:47:46.741502 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/2.log" Oct 11 10:47:46.742393 master-0 kubenswrapper[4790]: I1011 10:47:46.742333 4790 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd/1.log" Oct 11 10:47:46.743449 master-0 kubenswrapper[4790]: I1011 10:47:46.743383 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd-rev/0.log" Oct 11 10:47:46.745067 master-0 kubenswrapper[4790]: I1011 10:47:46.745012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcd-metrics/0.log" Oct 11 10:47:46.745670 master-0 kubenswrapper[4790]: I1011 10:47:46.745620 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_a7e53a8977ce5fc5588aef94f91dcc24/etcdctl/0.log" Oct 11 10:47:46.747232 master-0 kubenswrapper[4790]: I1011 10:47:46.747158 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca" exitCode=137 Oct 11 10:47:46.747232 master-0 kubenswrapper[4790]: I1011 10:47:46.747211 4790 generic.go:334] "Generic (PLEG): container finished" podID="a7e53a8977ce5fc5588aef94f91dcc24" containerID="d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970" exitCode=137 Oct 11 10:47:46.747412 master-0 kubenswrapper[4790]: I1011 10:47:46.747282 4790 scope.go:117] "RemoveContainer" containerID="b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca" Oct 11 10:47:46.747412 master-0 kubenswrapper[4790]: I1011 10:47:46.747315 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Oct 11 10:47:46.754875 master-0 kubenswrapper[4790]: I1011 10:47:46.754752 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-0" oldPodUID="a7e53a8977ce5fc5588aef94f91dcc24" podUID="14286286be88b59efc7cfc15eca1cc38" Oct 11 10:47:46.775305 master-0 kubenswrapper[4790]: I1011 10:47:46.775239 4790 scope.go:117] "RemoveContainer" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:47:46.787266 master-0 kubenswrapper[4790]: I1011 10:47:46.787183 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-0" oldPodUID="a7e53a8977ce5fc5588aef94f91dcc24" podUID="14286286be88b59efc7cfc15eca1cc38" Oct 11 10:47:46.811613 master-0 kubenswrapper[4790]: I1011 10:47:46.811482 4790 scope.go:117] "RemoveContainer" containerID="3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060" Oct 11 10:47:46.833783 master-0 kubenswrapper[4790]: I1011 10:47:46.833044 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body= Oct 11 10:47:46.833783 master-0 kubenswrapper[4790]: I1011 10:47:46.833134 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:46.842043 master-0 kubenswrapper[4790]: I1011 10:47:46.841961 4790 scope.go:117] "RemoveContainer" containerID="e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef" Oct 11 10:47:46.858630 master-0 kubenswrapper[4790]: I1011 10:47:46.858540 4790 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf"] Oct 11 10:47:46.859919 master-0 kubenswrapper[4790]: I1011 10:47:46.859872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" Oct 11 10:47:46.864405 master-0 kubenswrapper[4790]: I1011 10:47:46.864335 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 11 10:47:46.864683 master-0 kubenswrapper[4790]: I1011 10:47:46.864645 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 11 10:47:46.865411 master-0 kubenswrapper[4790]: I1011 10:47:46.865363 4790 scope.go:117] "RemoveContainer" containerID="95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377" Oct 11 10:47:46.883935 master-0 kubenswrapper[4790]: I1011 10:47:46.883880 4790 scope.go:117] "RemoveContainer" containerID="d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970" Oct 11 10:47:46.899791 master-0 kubenswrapper[4790]: I1011 10:47:46.899665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf"] Oct 11 10:47:46.907052 master-0 kubenswrapper[4790]: I1011 10:47:46.906599 4790 scope.go:117] "RemoveContainer" containerID="033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0" Oct 11 10:47:46.937853 master-0 kubenswrapper[4790]: I1011 10:47:46.937805 4790 scope.go:117] "RemoveContainer" containerID="57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f" Oct 11 10:47:46.976955 master-0 kubenswrapper[4790]: I1011 10:47:46.976747 4790 scope.go:117] "RemoveContainer" containerID="0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796" Oct 11 10:47:47.000341 master-0 kubenswrapper[4790]: I1011 10:47:47.000182 
4790 scope.go:117] "RemoveContainer" containerID="b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca" Oct 11 10:47:47.001058 master-0 kubenswrapper[4790]: E1011 10:47:47.001009 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca\": container with ID starting with b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca not found: ID does not exist" containerID="b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca" Oct 11 10:47:47.001123 master-0 kubenswrapper[4790]: I1011 10:47:47.001078 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca"} err="failed to get container status \"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca\": rpc error: code = NotFound desc = could not find container \"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca\": container with ID starting with b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca not found: ID does not exist" Oct 11 10:47:47.001123 master-0 kubenswrapper[4790]: I1011 10:47:47.001118 4790 scope.go:117] "RemoveContainer" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:47:47.001579 master-0 kubenswrapper[4790]: E1011 10:47:47.001556 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227\": container with ID starting with 31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227 not found: ID does not exist" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:47:47.001674 master-0 kubenswrapper[4790]: I1011 10:47:47.001655 4790 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227"} err="failed to get container status \"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227\": rpc error: code = NotFound desc = could not find container \"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227\": container with ID starting with 31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227 not found: ID does not exist" Oct 11 10:47:47.001778 master-0 kubenswrapper[4790]: I1011 10:47:47.001765 4790 scope.go:117] "RemoveContainer" containerID="3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060" Oct 11 10:47:47.002133 master-0 kubenswrapper[4790]: E1011 10:47:47.002117 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060\": container with ID starting with 3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060 not found: ID does not exist" containerID="3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060" Oct 11 10:47:47.002217 master-0 kubenswrapper[4790]: I1011 10:47:47.002201 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060"} err="failed to get container status \"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060\": rpc error: code = NotFound desc = could not find container \"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060\": container with ID starting with 3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060 not found: ID does not exist" Oct 11 10:47:47.002282 master-0 kubenswrapper[4790]: I1011 10:47:47.002272 4790 scope.go:117] "RemoveContainer" containerID="e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef" Oct 11 10:47:47.003016 master-0 kubenswrapper[4790]: 
E1011 10:47:47.003000 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef\": container with ID starting with e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef not found: ID does not exist" containerID="e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef" Oct 11 10:47:47.003106 master-0 kubenswrapper[4790]: I1011 10:47:47.003088 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef"} err="failed to get container status \"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef\": rpc error: code = NotFound desc = could not find container \"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef\": container with ID starting with e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef not found: ID does not exist" Oct 11 10:47:47.003172 master-0 kubenswrapper[4790]: I1011 10:47:47.003158 4790 scope.go:117] "RemoveContainer" containerID="95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377" Oct 11 10:47:47.003600 master-0 kubenswrapper[4790]: E1011 10:47:47.003585 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377\": container with ID starting with 95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377 not found: ID does not exist" containerID="95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377" Oct 11 10:47:47.003797 master-0 kubenswrapper[4790]: I1011 10:47:47.003780 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377"} err="failed to get container status 
\"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377\": rpc error: code = NotFound desc = could not find container \"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377\": container with ID starting with 95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377 not found: ID does not exist" Oct 11 10:47:47.003864 master-0 kubenswrapper[4790]: I1011 10:47:47.003852 4790 scope.go:117] "RemoveContainer" containerID="d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970" Oct 11 10:47:47.004387 master-0 kubenswrapper[4790]: E1011 10:47:47.004345 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970\": container with ID starting with d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970 not found: ID does not exist" containerID="d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970" Oct 11 10:47:47.004472 master-0 kubenswrapper[4790]: I1011 10:47:47.004454 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970"} err="failed to get container status \"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970\": rpc error: code = NotFound desc = could not find container \"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970\": container with ID starting with d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970 not found: ID does not exist" Oct 11 10:47:47.004535 master-0 kubenswrapper[4790]: I1011 10:47:47.004524 4790 scope.go:117] "RemoveContainer" containerID="033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0" Oct 11 10:47:47.004851 master-0 kubenswrapper[4790]: E1011 10:47:47.004836 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0\": container with ID starting with 033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0 not found: ID does not exist" containerID="033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0" Oct 11 10:47:47.005032 master-0 kubenswrapper[4790]: I1011 10:47:47.005014 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0"} err="failed to get container status \"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0\": rpc error: code = NotFound desc = could not find container \"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0\": container with ID starting with 033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0 not found: ID does not exist" Oct 11 10:47:47.005102 master-0 kubenswrapper[4790]: I1011 10:47:47.005092 4790 scope.go:117] "RemoveContainer" containerID="57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f" Oct 11 10:47:47.005505 master-0 kubenswrapper[4790]: E1011 10:47:47.005491 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f\": container with ID starting with 57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f not found: ID does not exist" containerID="57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f" Oct 11 10:47:47.005591 master-0 kubenswrapper[4790]: I1011 10:47:47.005572 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f"} err="failed to get container status \"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f\": rpc error: code = NotFound desc = could not find container 
\"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f\": container with ID starting with 57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f not found: ID does not exist" Oct 11 10:47:47.005659 master-0 kubenswrapper[4790]: I1011 10:47:47.005645 4790 scope.go:117] "RemoveContainer" containerID="0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796" Oct 11 10:47:47.006063 master-0 kubenswrapper[4790]: E1011 10:47:47.006043 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796\": container with ID starting with 0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796 not found: ID does not exist" containerID="0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796" Oct 11 10:47:47.006151 master-0 kubenswrapper[4790]: I1011 10:47:47.006135 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796"} err="failed to get container status \"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796\": rpc error: code = NotFound desc = could not find container \"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796\": container with ID starting with 0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796 not found: ID does not exist" Oct 11 10:47:47.006216 master-0 kubenswrapper[4790]: I1011 10:47:47.006206 4790 scope.go:117] "RemoveContainer" containerID="b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca" Oct 11 10:47:47.006547 master-0 kubenswrapper[4790]: I1011 10:47:47.006531 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca"} err="failed to get container status 
\"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca\": rpc error: code = NotFound desc = could not find container \"b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca\": container with ID starting with b73aa8db7e53e83a1a9430aab15d1a11f28139414df55dce80c436d48922f6ca not found: ID does not exist" Oct 11 10:47:47.006658 master-0 kubenswrapper[4790]: I1011 10:47:47.006644 4790 scope.go:117] "RemoveContainer" containerID="31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227" Oct 11 10:47:47.008651 master-0 kubenswrapper[4790]: I1011 10:47:47.008634 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227"} err="failed to get container status \"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227\": rpc error: code = NotFound desc = could not find container \"31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227\": container with ID starting with 31fd9a1eb8f7baffc7dd0dc0c3438ec3345e4e70ffaf5fbc351f82b6c2165227 not found: ID does not exist" Oct 11 10:47:47.008753 master-0 kubenswrapper[4790]: I1011 10:47:47.008742 4790 scope.go:117] "RemoveContainer" containerID="3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060" Oct 11 10:47:47.009202 master-0 kubenswrapper[4790]: I1011 10:47:47.009147 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060"} err="failed to get container status \"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060\": rpc error: code = NotFound desc = could not find container \"3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060\": container with ID starting with 3581fc03c1130dc2dfacc8a0d155544ce350f22134d68a0cd6e79314b24ed060 not found: ID does not exist" Oct 11 10:47:47.009282 master-0 kubenswrapper[4790]: I1011 10:47:47.009270 4790 
scope.go:117] "RemoveContainer" containerID="e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef" Oct 11 10:47:47.009655 master-0 kubenswrapper[4790]: I1011 10:47:47.009638 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef"} err="failed to get container status \"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef\": rpc error: code = NotFound desc = could not find container \"e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef\": container with ID starting with e4fe8f20eab6d0f57dd7ff0348d7d79afb0c0c3e2753d58dd10a19dafe7b60ef not found: ID does not exist" Oct 11 10:47:47.009782 master-0 kubenswrapper[4790]: I1011 10:47:47.009762 4790 scope.go:117] "RemoveContainer" containerID="95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377" Oct 11 10:47:47.010236 master-0 kubenswrapper[4790]: I1011 10:47:47.010218 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377"} err="failed to get container status \"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377\": rpc error: code = NotFound desc = could not find container \"95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377\": container with ID starting with 95fc9c6d467ef4be49d4e412e32d764ddb12e42ab41ba14dd9259747220b1377 not found: ID does not exist" Oct 11 10:47:47.010369 master-0 kubenswrapper[4790]: I1011 10:47:47.010356 4790 scope.go:117] "RemoveContainer" containerID="d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970" Oct 11 10:47:47.010806 master-0 kubenswrapper[4790]: I1011 10:47:47.010780 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970"} err="failed to get container status 
\"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970\": rpc error: code = NotFound desc = could not find container \"d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970\": container with ID starting with d5626e357f9cecab4c479d110f8d9828e354694d4906d68734ca04db0e170970 not found: ID does not exist" Oct 11 10:47:47.010923 master-0 kubenswrapper[4790]: I1011 10:47:47.010877 4790 scope.go:117] "RemoveContainer" containerID="033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0" Oct 11 10:47:47.011311 master-0 kubenswrapper[4790]: I1011 10:47:47.011290 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0"} err="failed to get container status \"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0\": rpc error: code = NotFound desc = could not find container \"033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0\": container with ID starting with 033430f1d3826d181c0b75fb8343feb31c60c650260f0caebe6a290f1c9deca0 not found: ID does not exist" Oct 11 10:47:47.011448 master-0 kubenswrapper[4790]: I1011 10:47:47.011426 4790 scope.go:117] "RemoveContainer" containerID="57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f" Oct 11 10:47:47.012061 master-0 kubenswrapper[4790]: I1011 10:47:47.012038 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f"} err="failed to get container status \"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f\": rpc error: code = NotFound desc = could not find container \"57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f\": container with ID starting with 57c32f6a37a34d84078aadb245675fb641cce16a04adbde4e12fbce6a920df2f not found: ID does not exist" Oct 11 10:47:47.012169 master-0 kubenswrapper[4790]: I1011 10:47:47.012152 4790 
scope.go:117] "RemoveContainer" containerID="0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796" Oct 11 10:47:47.012812 master-0 kubenswrapper[4790]: I1011 10:47:47.012793 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796"} err="failed to get container status \"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796\": rpc error: code = NotFound desc = could not find container \"0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796\": container with ID starting with 0b424e04233179f6773559ee329a6da085bcdbf1e83437befa1c0cee6d727796 not found: ID does not exist" Oct 11 10:47:47.044767 master-0 kubenswrapper[4790]: I1011 10:47:47.044667 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zzd\" (UniqueName: \"kubernetes.io/projected/2086cc9e-bd35-4e52-94aa-25d3e140537f-kube-api-access-26zzd\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2chzf\" (UID: \"2086cc9e-bd35-4e52-94aa-25d3e140537f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" Oct 11 10:47:47.148371 master-0 kubenswrapper[4790]: I1011 10:47:47.146857 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26zzd\" (UniqueName: \"kubernetes.io/projected/2086cc9e-bd35-4e52-94aa-25d3e140537f-kube-api-access-26zzd\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2chzf\" (UID: \"2086cc9e-bd35-4e52-94aa-25d3e140537f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" Oct 11 10:47:47.175662 master-0 kubenswrapper[4790]: I1011 10:47:47.175594 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26zzd\" (UniqueName: \"kubernetes.io/projected/2086cc9e-bd35-4e52-94aa-25d3e140537f-kube-api-access-26zzd\") pod 
\"cert-manager-operator-controller-manager-57cd46d6d-2chzf\" (UID: \"2086cc9e-bd35-4e52-94aa-25d3e140537f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" Oct 11 10:47:47.184121 master-0 kubenswrapper[4790]: I1011 10:47:47.184069 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" Oct 11 10:47:47.593327 master-0 kubenswrapper[4790]: I1011 10:47:47.593230 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf"] Oct 11 10:47:47.602957 master-0 kubenswrapper[4790]: W1011 10:47:47.602873 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2086cc9e_bd35_4e52_94aa_25d3e140537f.slice/crio-b79f15327cd61b03ec5d22756d0f8cccda1ea150b36305e1cd7b4f98efd84d79 WatchSource:0}: Error finding container b79f15327cd61b03ec5d22756d0f8cccda1ea150b36305e1cd7b4f98efd84d79: Status 404 returned error can't find the container with id b79f15327cd61b03ec5d22756d0f8cccda1ea150b36305e1cd7b4f98efd84d79 Oct 11 10:47:47.757453 master-0 kubenswrapper[4790]: I1011 10:47:47.757371 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" event={"ID":"2086cc9e-bd35-4e52-94aa-25d3e140537f","Type":"ContainerStarted","Data":"b79f15327cd61b03ec5d22756d0f8cccda1ea150b36305e1cd7b4f98efd84d79"} Oct 11 10:47:48.306687 master-0 kubenswrapper[4790]: I1011 10:47:48.306603 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e53a8977ce5fc5588aef94f91dcc24" path="/var/lib/kubelet/pods/a7e53a8977ce5fc5588aef94f91dcc24/volumes" Oct 11 10:47:51.833137 master-0 kubenswrapper[4790]: I1011 10:47:51.833062 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: 
Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body= Oct 11 10:47:51.833830 master-0 kubenswrapper[4790]: I1011 10:47:51.833253 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" Oct 11 10:47:53.797669 master-0 kubenswrapper[4790]: I1011 10:47:53.797578 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" event={"ID":"2086cc9e-bd35-4e52-94aa-25d3e140537f","Type":"ContainerStarted","Data":"75da7807698c214f672031efcf5a3b337d41563ad51027bb40f51470996ac593"} Oct 11 10:47:53.833125 master-0 kubenswrapper[4790]: I1011 10:47:53.832981 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2chzf" podStartSLOduration=2.36817931 podStartE2EDuration="7.832948416s" podCreationTimestamp="2025-10-11 10:47:46 +0000 UTC" firstStartedPulling="2025-10-11 10:47:47.607412444 +0000 UTC m=+544.161872736" lastFinishedPulling="2025-10-11 10:47:53.07218155 +0000 UTC m=+549.626641842" observedRunningTime="2025-10-11 10:47:53.830754188 +0000 UTC m=+550.385214500" watchObservedRunningTime="2025-10-11 10:47:53.832948416 +0000 UTC m=+550.387408738" Oct 11 10:47:55.881968 master-0 kubenswrapper[4790]: I1011 10:47:55.881921 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-nb76r"] Oct 11 10:47:55.883294 master-0 kubenswrapper[4790]: I1011 10:47:55.883276 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:55.888585 master-0 kubenswrapper[4790]: I1011 10:47:55.888535 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Oct 11 10:47:55.889111 master-0 kubenswrapper[4790]: I1011 10:47:55.889078 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Oct 11 10:47:55.916982 master-0 kubenswrapper[4790]: I1011 10:47:55.916918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-nb76r"]
Oct 11 10:47:55.960735 master-0 kubenswrapper[4790]: I1011 10:47:55.959767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9abe9cfa-95f8-4a08-bbc2-27776956894d-bound-sa-token\") pod \"cert-manager-webhook-d969966f-nb76r\" (UID: \"9abe9cfa-95f8-4a08-bbc2-27776956894d\") " pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:55.960735 master-0 kubenswrapper[4790]: I1011 10:47:55.959849 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcfk\" (UniqueName: \"kubernetes.io/projected/9abe9cfa-95f8-4a08-bbc2-27776956894d-kube-api-access-qfcfk\") pod \"cert-manager-webhook-d969966f-nb76r\" (UID: \"9abe9cfa-95f8-4a08-bbc2-27776956894d\") " pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:56.060751 master-0 kubenswrapper[4790]: I1011 10:47:56.060664 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfcfk\" (UniqueName: \"kubernetes.io/projected/9abe9cfa-95f8-4a08-bbc2-27776956894d-kube-api-access-qfcfk\") pod \"cert-manager-webhook-d969966f-nb76r\" (UID: \"9abe9cfa-95f8-4a08-bbc2-27776956894d\") " pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:56.061099 master-0 kubenswrapper[4790]: I1011 10:47:56.060806 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9abe9cfa-95f8-4a08-bbc2-27776956894d-bound-sa-token\") pod \"cert-manager-webhook-d969966f-nb76r\" (UID: \"9abe9cfa-95f8-4a08-bbc2-27776956894d\") " pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:56.082082 master-0 kubenswrapper[4790]: I1011 10:47:56.082030 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9abe9cfa-95f8-4a08-bbc2-27776956894d-bound-sa-token\") pod \"cert-manager-webhook-d969966f-nb76r\" (UID: \"9abe9cfa-95f8-4a08-bbc2-27776956894d\") " pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:56.091111 master-0 kubenswrapper[4790]: I1011 10:47:56.091034 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfcfk\" (UniqueName: \"kubernetes.io/projected/9abe9cfa-95f8-4a08-bbc2-27776956894d-kube-api-access-qfcfk\") pod \"cert-manager-webhook-d969966f-nb76r\" (UID: \"9abe9cfa-95f8-4a08-bbc2-27776956894d\") " pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:56.207319 master-0 kubenswrapper[4790]: I1011 10:47:56.207134 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:47:56.292816 master-0 kubenswrapper[4790]: I1011 10:47:56.292340 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Oct 11 10:47:56.320580 master-0 kubenswrapper[4790]: I1011 10:47:56.320492 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="a13f92a6-018a-40b2-bc65-890f74a263cf"
Oct 11 10:47:56.320580 master-0 kubenswrapper[4790]: I1011 10:47:56.320563 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="a13f92a6-018a-40b2-bc65-890f74a263cf"
Oct 11 10:47:56.354304 master-0 kubenswrapper[4790]: I1011 10:47:56.349102 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"]
Oct 11 10:47:56.354304 master-0 kubenswrapper[4790]: I1011 10:47:56.353388 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"]
Oct 11 10:47:56.354304 master-0 kubenswrapper[4790]: I1011 10:47:56.353541 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0"
Oct 11 10:47:56.379819 master-0 kubenswrapper[4790]: I1011 10:47:56.379673 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Oct 11 10:47:56.385361 master-0 kubenswrapper[4790]: I1011 10:47:56.385309 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"]
Oct 11 10:47:56.678895 master-0 kubenswrapper[4790]: I1011 10:47:56.678826 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-nb76r"]
Oct 11 10:47:56.740201 master-0 kubenswrapper[4790]: W1011 10:47:56.740130 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9abe9cfa_95f8_4a08_bbc2_27776956894d.slice/crio-059a63b44d0423b9932a13d24a6332d5ff6d5c9943bc0816c4feec4b89eabdf4 WatchSource:0}: Error finding container 059a63b44d0423b9932a13d24a6332d5ff6d5c9943bc0816c4feec4b89eabdf4: Status 404 returned error can't find the container with id 059a63b44d0423b9932a13d24a6332d5ff6d5c9943bc0816c4feec4b89eabdf4
Oct 11 10:47:56.823281 master-0 kubenswrapper[4790]: I1011 10:47:56.823235 4790 generic.go:334] "Generic (PLEG): container finished" podID="14286286be88b59efc7cfc15eca1cc38" containerID="bfadd2755eb7320911873101cfc631f1a704f65f1ecce019279ff9bc67ece8e4" exitCode=0
Oct 11 10:47:56.823480 master-0 kubenswrapper[4790]: I1011 10:47:56.823310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerDied","Data":"bfadd2755eb7320911873101cfc631f1a704f65f1ecce019279ff9bc67ece8e4"}
Oct 11 10:47:56.823480 master-0 kubenswrapper[4790]: I1011 10:47:56.823342 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerStarted","Data":"e78b164ae9dda95edce0f09c5efcd61213f9f83e4ab58beb48285be2e9c46bac"}
Oct 11 10:47:56.824577 master-0 kubenswrapper[4790]: I1011 10:47:56.824537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-nb76r" event={"ID":"9abe9cfa-95f8-4a08-bbc2-27776956894d","Type":"ContainerStarted","Data":"059a63b44d0423b9932a13d24a6332d5ff6d5c9943bc0816c4feec4b89eabdf4"}
Oct 11 10:47:56.834182 master-0 kubenswrapper[4790]: I1011 10:47:56.834135 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused" start-of-body=
Oct 11 10:47:56.834331 master-0 kubenswrapper[4790]: I1011 10:47:56.834191 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": dial tcp 192.168.34.10:9980: connect: connection refused"
Oct 11 10:47:57.832109 master-0 kubenswrapper[4790]: I1011 10:47:57.831984 4790 generic.go:334] "Generic (PLEG): container finished" podID="14286286be88b59efc7cfc15eca1cc38" containerID="7d63b1afde70e72ad60f45f2155003b889c6d6ab5be70efe5f737a384950ad05" exitCode=0
Oct 11 10:47:57.832109 master-0 kubenswrapper[4790]: I1011 10:47:57.832044 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerDied","Data":"7d63b1afde70e72ad60f45f2155003b889c6d6ab5be70efe5f737a384950ad05"}
Oct 11 10:47:58.839749 master-0 kubenswrapper[4790]: I1011 10:47:58.839611 4790 generic.go:334] "Generic (PLEG): container finished" podID="14286286be88b59efc7cfc15eca1cc38" containerID="6591437e1f6567863066369b6d16e7e64b625afe8e9aac3f31ee299e2668dd5c" exitCode=0
Oct 11 10:47:58.839749 master-0 kubenswrapper[4790]: I1011 10:47:58.839681 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerDied","Data":"6591437e1f6567863066369b6d16e7e64b625afe8e9aac3f31ee299e2668dd5c"}
Oct 11 10:47:58.859926 master-0 kubenswrapper[4790]: I1011 10:47:58.857811 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"]
Oct 11 10:47:58.859926 master-0 kubenswrapper[4790]: I1011 10:47:58.858395 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:58.880742 master-0 kubenswrapper[4790]: I1011 10:47:58.879091 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"]
Oct 11 10:47:58.905863 master-0 kubenswrapper[4790]: I1011 10:47:58.905813 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74561c36-ea2e-4209-9253-b6a58d832f5f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-grj2w\" (UID: \"74561c36-ea2e-4209-9253-b6a58d832f5f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:58.906017 master-0 kubenswrapper[4790]: I1011 10:47:58.905878 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48wx\" (UniqueName: \"kubernetes.io/projected/74561c36-ea2e-4209-9253-b6a58d832f5f-kube-api-access-g48wx\") pod \"cert-manager-cainjector-7d9f95dbf-grj2w\" (UID: \"74561c36-ea2e-4209-9253-b6a58d832f5f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:59.007186 master-0 kubenswrapper[4790]: I1011 10:47:59.006591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74561c36-ea2e-4209-9253-b6a58d832f5f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-grj2w\" (UID: \"74561c36-ea2e-4209-9253-b6a58d832f5f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:59.007186 master-0 kubenswrapper[4790]: I1011 10:47:59.006668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g48wx\" (UniqueName: \"kubernetes.io/projected/74561c36-ea2e-4209-9253-b6a58d832f5f-kube-api-access-g48wx\") pod \"cert-manager-cainjector-7d9f95dbf-grj2w\" (UID: \"74561c36-ea2e-4209-9253-b6a58d832f5f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:59.043371 master-0 kubenswrapper[4790]: I1011 10:47:59.032718 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74561c36-ea2e-4209-9253-b6a58d832f5f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-grj2w\" (UID: \"74561c36-ea2e-4209-9253-b6a58d832f5f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:59.043371 master-0 kubenswrapper[4790]: I1011 10:47:59.037244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48wx\" (UniqueName: \"kubernetes.io/projected/74561c36-ea2e-4209-9253-b6a58d832f5f-kube-api-access-g48wx\") pod \"cert-manager-cainjector-7d9f95dbf-grj2w\" (UID: \"74561c36-ea2e-4209-9253-b6a58d832f5f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:59.199804 master-0 kubenswrapper[4790]: I1011 10:47:59.199733 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"
Oct 11 10:47:59.662730 master-0 kubenswrapper[4790]: I1011 10:47:59.659065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w"]
Oct 11 10:47:59.848861 master-0 kubenswrapper[4790]: I1011 10:47:59.848763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerStarted","Data":"0dd6722942b406e54c78f2254c3ac8a43586d9102f113302b3c46811ed8a2fd7"}
Oct 11 10:47:59.849403 master-0 kubenswrapper[4790]: I1011 10:47:59.848865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerStarted","Data":"2f7bcc72fad76121492b63caeffaf1694217e2478cc44d558ae9c6beb845205e"}
Oct 11 10:47:59.849403 master-0 kubenswrapper[4790]: I1011 10:47:59.848897 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerStarted","Data":"f9192c343b8e534587ea6333b21771e1b2fc25d380603ab9c4b5eaf439343cdb"}
Oct 11 10:47:59.850457 master-0 kubenswrapper[4790]: I1011 10:47:59.850395 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w" event={"ID":"74561c36-ea2e-4209-9253-b6a58d832f5f","Type":"ContainerStarted","Data":"7ce06916680bfdbb808cc899797e575ec7eb561ea7b8bc94474076f5854532c7"}
Oct 11 10:48:00.859838 master-0 kubenswrapper[4790]: I1011 10:48:00.859769 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerStarted","Data":"c1eb6340e8713b2b5442e1c58163b72b76e3b53b381610ba551f21765bb9d626"}
Oct 11 10:48:01.870048 master-0 kubenswrapper[4790]: I1011 10:48:01.869974 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w" event={"ID":"74561c36-ea2e-4209-9253-b6a58d832f5f","Type":"ContainerStarted","Data":"5d0a40cf1b74d36b15a69de30b09bf0bcf44c1620f955d762c5f522e94d65820"}
Oct 11 10:48:01.872122 master-0 kubenswrapper[4790]: I1011 10:48:01.872078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-nb76r" event={"ID":"9abe9cfa-95f8-4a08-bbc2-27776956894d","Type":"ContainerStarted","Data":"66674512fe4e633d963250cbabb536c6494725fe376372d0b6e06c069ecf34b0"}
Oct 11 10:48:01.872312 master-0 kubenswrapper[4790]: I1011 10:48:01.872279 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:48:01.877681 master-0 kubenswrapper[4790]: I1011 10:48:01.877626 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"14286286be88b59efc7cfc15eca1cc38","Type":"ContainerStarted","Data":"b5344628af1fb60355807e51149b37a4033b5ad835d2decefc545134b802a8db"}
Oct 11 10:48:01.962957 master-0 kubenswrapper[4790]: I1011 10:48:01.962847 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-grj2w" podStartSLOduration=2.720260952 podStartE2EDuration="3.962823827s" podCreationTimestamp="2025-10-11 10:47:58 +0000 UTC" firstStartedPulling="2025-10-11 10:47:59.678222274 +0000 UTC m=+556.232682566" lastFinishedPulling="2025-10-11 10:48:00.920785149 +0000 UTC m=+557.475245441" observedRunningTime="2025-10-11 10:48:01.893798011 +0000 UTC m=+558.448258373" watchObservedRunningTime="2025-10-11 10:48:01.962823827 +0000 UTC m=+558.517284119"
Oct 11 10:48:01.963340 master-0 kubenswrapper[4790]: I1011 10:48:01.963002 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=5.962998671 podStartE2EDuration="5.962998671s" podCreationTimestamp="2025-10-11 10:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:48:01.958371586 +0000 UTC m=+558.512831908" watchObservedRunningTime="2025-10-11 10:48:01.962998671 +0000 UTC m=+558.517458963"
Oct 11 10:48:01.989624 master-0 kubenswrapper[4790]: I1011 10:48:01.989519 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-nb76r" podStartSLOduration=2.810983699 podStartE2EDuration="6.989495277s" podCreationTimestamp="2025-10-11 10:47:55 +0000 UTC" firstStartedPulling="2025-10-11 10:47:56.743027011 +0000 UTC m=+553.297487303" lastFinishedPulling="2025-10-11 10:48:00.921538579 +0000 UTC m=+557.475998881" observedRunningTime="2025-10-11 10:48:01.985858649 +0000 UTC m=+558.540318961" watchObservedRunningTime="2025-10-11 10:48:01.989495277 +0000 UTC m=+558.543955569"
Oct 11 10:48:03.725816 master-0 kubenswrapper[4790]: I1011 10:48:03.725696 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"]
Oct 11 10:48:03.726605 master-0 kubenswrapper[4790]: I1011 10:48:03.726456 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:03.729057 master-0 kubenswrapper[4790]: I1011 10:48:03.729015 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Oct 11 10:48:03.729131 master-0 kubenswrapper[4790]: I1011 10:48:03.729026 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Oct 11 10:48:03.729240 master-0 kubenswrapper[4790]: I1011 10:48:03.729207 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Oct 11 10:48:03.730632 master-0 kubenswrapper[4790]: I1011 10:48:03.730301 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Oct 11 10:48:03.749623 master-0 kubenswrapper[4790]: I1011 10:48:03.749565 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"]
Oct 11 10:48:03.902777 master-0 kubenswrapper[4790]: I1011 10:48:03.902686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-webhook-cert\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:03.902777 master-0 kubenswrapper[4790]: I1011 10:48:03.902774 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsk8\" (UniqueName: \"kubernetes.io/projected/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-kube-api-access-xxsk8\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:03.903036 master-0 kubenswrapper[4790]: I1011 10:48:03.902799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-apiservice-cert\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.004810 master-0 kubenswrapper[4790]: I1011 10:48:04.004640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-webhook-cert\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.004810 master-0 kubenswrapper[4790]: I1011 10:48:04.004764 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsk8\" (UniqueName: \"kubernetes.io/projected/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-kube-api-access-xxsk8\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.004810 master-0 kubenswrapper[4790]: I1011 10:48:04.004802 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-apiservice-cert\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.008397 master-0 kubenswrapper[4790]: I1011 10:48:04.008343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-webhook-cert\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.009033 master-0 kubenswrapper[4790]: I1011 10:48:04.009001 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-apiservice-cert\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.035276 master-0 kubenswrapper[4790]: I1011 10:48:04.035207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsk8\" (UniqueName: \"kubernetes.io/projected/01ae1cda-0c92-4f86-bff5-90e6cbb3881e-kube-api-access-xxsk8\") pod \"metallb-operator-controller-manager-56b566d9f-hppvq\" (UID: \"01ae1cda-0c92-4f86-bff5-90e6cbb3881e\") " pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.040015 master-0 kubenswrapper[4790]: I1011 10:48:04.039960 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:04.363083 master-0 kubenswrapper[4790]: I1011 10:48:04.363004 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"]
Oct 11 10:48:04.363877 master-0 kubenswrapper[4790]: I1011 10:48:04.363785 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.366558 master-0 kubenswrapper[4790]: I1011 10:48:04.366397 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 11 10:48:04.368797 master-0 kubenswrapper[4790]: I1011 10:48:04.366666 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Oct 11 10:48:04.429627 master-0 kubenswrapper[4790]: I1011 10:48:04.385534 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"]
Oct 11 10:48:04.493793 master-0 kubenswrapper[4790]: I1011 10:48:04.493062 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"]
Oct 11 10:48:04.495269 master-0 kubenswrapper[4790]: W1011 10:48:04.495220 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ae1cda_0c92_4f86_bff5_90e6cbb3881e.slice/crio-4867139461cbffe57276c5a4e6bd53f3c3fb5ab7002db203271fe065ba453dfb WatchSource:0}: Error finding container 4867139461cbffe57276c5a4e6bd53f3c3fb5ab7002db203271fe065ba453dfb: Status 404 returned error can't find the container with id 4867139461cbffe57276c5a4e6bd53f3c3fb5ab7002db203271fe065ba453dfb
Oct 11 10:48:04.531601 master-0 kubenswrapper[4790]: I1011 10:48:04.531550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d4bad0b-955f-4d0e-8849-8257c50682cb-apiservice-cert\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.531851 master-0 kubenswrapper[4790]: I1011 10:48:04.531615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5df\" (UniqueName: \"kubernetes.io/projected/3d4bad0b-955f-4d0e-8849-8257c50682cb-kube-api-access-8v5df\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.531851 master-0 kubenswrapper[4790]: I1011 10:48:04.531644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d4bad0b-955f-4d0e-8849-8257c50682cb-webhook-cert\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.633494 master-0 kubenswrapper[4790]: I1011 10:48:04.633344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5df\" (UniqueName: \"kubernetes.io/projected/3d4bad0b-955f-4d0e-8849-8257c50682cb-kube-api-access-8v5df\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.633494 master-0 kubenswrapper[4790]: I1011 10:48:04.633424 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d4bad0b-955f-4d0e-8849-8257c50682cb-webhook-cert\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.633494 master-0 kubenswrapper[4790]: I1011 10:48:04.633479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d4bad0b-955f-4d0e-8849-8257c50682cb-apiservice-cert\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.637090 master-0 kubenswrapper[4790]: I1011 10:48:04.637054 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d4bad0b-955f-4d0e-8849-8257c50682cb-apiservice-cert\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.638336 master-0 kubenswrapper[4790]: I1011 10:48:04.638280 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d4bad0b-955f-4d0e-8849-8257c50682cb-webhook-cert\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.667797 master-0 kubenswrapper[4790]: I1011 10:48:04.667727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5df\" (UniqueName: \"kubernetes.io/projected/3d4bad0b-955f-4d0e-8849-8257c50682cb-kube-api-access-8v5df\") pod \"metallb-operator-webhook-server-84d69c968c-btbcm\" (UID: \"3d4bad0b-955f-4d0e-8849-8257c50682cb\") " pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.732802 master-0 kubenswrapper[4790]: I1011 10:48:04.732727 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:04.899505 master-0 kubenswrapper[4790]: I1011 10:48:04.899345 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq" event={"ID":"01ae1cda-0c92-4f86-bff5-90e6cbb3881e","Type":"ContainerStarted","Data":"4867139461cbffe57276c5a4e6bd53f3c3fb5ab7002db203271fe065ba453dfb"}
Oct 11 10:48:05.122805 master-0 kubenswrapper[4790]: I1011 10:48:05.122224 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"]
Oct 11 10:48:05.126947 master-0 kubenswrapper[4790]: W1011 10:48:05.126877 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d4bad0b_955f_4d0e_8849_8257c50682cb.slice/crio-57bde74650573cb025cf5a842d2c32a6ebe4755fd5aa0ac986b69535ad98665c WatchSource:0}: Error finding container 57bde74650573cb025cf5a842d2c32a6ebe4755fd5aa0ac986b69535ad98665c: Status 404 returned error can't find the container with id 57bde74650573cb025cf5a842d2c32a6ebe4755fd5aa0ac986b69535ad98665c
Oct 11 10:48:05.912737 master-0 kubenswrapper[4790]: I1011 10:48:05.911919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm" event={"ID":"3d4bad0b-955f-4d0e-8849-8257c50682cb","Type":"ContainerStarted","Data":"57bde74650573cb025cf5a842d2c32a6ebe4755fd5aa0ac986b69535ad98665c"}
Oct 11 10:48:06.211880 master-0 kubenswrapper[4790]: I1011 10:48:06.211737 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-nb76r"
Oct 11 10:48:06.390016 master-0 kubenswrapper[4790]: I1011 10:48:06.389947 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Oct 11 10:48:06.392588 master-0 kubenswrapper[4790]: I1011 10:48:06.390848 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Oct 11 10:48:06.833737 master-0 kubenswrapper[4790]: I1011 10:48:06.833635 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 11 10:48:06.834069 master-0 kubenswrapper[4790]: I1011 10:48:06.833752 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 11 10:48:07.928032 master-0 kubenswrapper[4790]: I1011 10:48:07.927688 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq" event={"ID":"01ae1cda-0c92-4f86-bff5-90e6cbb3881e","Type":"ContainerStarted","Data":"e99479718a5b7e2800963c9d04442b4c128d951c09212a5b57a9152a94e6b303"}
Oct 11 10:48:07.928032 master-0 kubenswrapper[4790]: I1011 10:48:07.927932 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq"
Oct 11 10:48:07.968883 master-0 kubenswrapper[4790]: I1011 10:48:07.968727 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq" podStartSLOduration=1.802334223 podStartE2EDuration="4.968684992s" podCreationTimestamp="2025-10-11 10:48:03 +0000 UTC" firstStartedPulling="2025-10-11 10:48:04.50134625 +0000 UTC m=+561.055806542" lastFinishedPulling="2025-10-11 10:48:07.667697019 +0000 UTC m=+564.222157311" observedRunningTime="2025-10-11 10:48:07.964341425 +0000 UTC m=+564.518801717" watchObservedRunningTime="2025-10-11 10:48:07.968684992 +0000 UTC m=+564.523145284"
Oct 11 10:48:08.470641 master-0 kubenswrapper[4790]: I1011 10:48:08.470542 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"]
Oct 11 10:48:08.471303 master-0 kubenswrapper[4790]: I1011 10:48:08.471280 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"
Oct 11 10:48:08.473584 master-0 kubenswrapper[4790]: I1011 10:48:08.473514 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 11 10:48:08.474005 master-0 kubenswrapper[4790]: I1011 10:48:08.473952 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 11 10:48:08.485347 master-0 kubenswrapper[4790]: I1011 10:48:08.485278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"]
Oct 11 10:48:08.505229 master-0 kubenswrapper[4790]: I1011 10:48:08.505175 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqqc4\" (UniqueName: \"kubernetes.io/projected/6b296384-0413-4a1d-825b-530b97e53c9a-kube-api-access-pqqc4\") pod \"nmstate-operator-858ddd8f98-pnhrj\" (UID: \"6b296384-0413-4a1d-825b-530b97e53c9a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"
Oct 11 10:48:08.606567 master-0 kubenswrapper[4790]: I1011 10:48:08.606503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqqc4\" (UniqueName: \"kubernetes.io/projected/6b296384-0413-4a1d-825b-530b97e53c9a-kube-api-access-pqqc4\") pod \"nmstate-operator-858ddd8f98-pnhrj\" (UID: \"6b296384-0413-4a1d-825b-530b97e53c9a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"
Oct 11 10:48:08.638740 master-0 kubenswrapper[4790]: I1011 10:48:08.633898 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqqc4\" (UniqueName: \"kubernetes.io/projected/6b296384-0413-4a1d-825b-530b97e53c9a-kube-api-access-pqqc4\") pod \"nmstate-operator-858ddd8f98-pnhrj\" (UID: \"6b296384-0413-4a1d-825b-530b97e53c9a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"
Oct 11 10:48:08.823420 master-0 kubenswrapper[4790]: I1011 10:48:08.823325 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"
Oct 11 10:48:09.907889 master-0 kubenswrapper[4790]: I1011 10:48:09.907674 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj"]
Oct 11 10:48:09.918026 master-0 kubenswrapper[4790]: W1011 10:48:09.917952 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b296384_0413_4a1d_825b_530b97e53c9a.slice/crio-8ddac305cd29bae1b3728337a406fac4652bf7d6296684a083d2e3d6f72d0c14 WatchSource:0}: Error finding container 8ddac305cd29bae1b3728337a406fac4652bf7d6296684a083d2e3d6f72d0c14: Status 404 returned error can't find the container with id 8ddac305cd29bae1b3728337a406fac4652bf7d6296684a083d2e3d6f72d0c14
Oct 11 10:48:09.940030 master-0 kubenswrapper[4790]: I1011 10:48:09.939924 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj" event={"ID":"6b296384-0413-4a1d-825b-530b97e53c9a","Type":"ContainerStarted","Data":"8ddac305cd29bae1b3728337a406fac4652bf7d6296684a083d2e3d6f72d0c14"}
Oct 11 10:48:09.942944 master-0 kubenswrapper[4790]: I1011 10:48:09.942285 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm" event={"ID":"3d4bad0b-955f-4d0e-8849-8257c50682cb","Type":"ContainerStarted","Data":"75a2cd5e768356f1bb494fc5d056e2b1d7d6ecf960512659b6f293f638834254"}
Oct 11 10:48:09.942944 master-0 kubenswrapper[4790]: I1011 10:48:09.942888 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm"
Oct 11 10:48:11.834310 master-0 kubenswrapper[4790]: I1011 10:48:11.833940 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 11 10:48:11.836133 master-0 kubenswrapper[4790]: I1011 10:48:11.834349 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 11 10:48:11.956673 master-0 kubenswrapper[4790]: I1011 10:48:11.956559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj" event={"ID":"6b296384-0413-4a1d-825b-530b97e53c9a","Type":"ContainerStarted","Data":"371194d2a4de8f9948b4470f90b110478fc7afbffb9265643326ed10249c415c"}
Oct 11 10:48:11.985801 master-0 kubenswrapper[4790]: I1011 10:48:11.985696 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm" podStartSLOduration=3.597787519 podStartE2EDuration="7.985670597s" podCreationTimestamp="2025-10-11 10:48:04 +0000 UTC" firstStartedPulling="2025-10-11 10:48:05.130686195 +0000 UTC m=+561.685146487" lastFinishedPulling="2025-10-11 10:48:09.518569273 +0000 UTC m=+566.073029565" observedRunningTime="2025-10-11 10:48:09.975375366 +0000 UTC m=+566.529835678" watchObservedRunningTime="2025-10-11 10:48:11.985670597 +0000 UTC m=+568.540130889"
Oct 11 10:48:11.986103 master-0 kubenswrapper[4790]: I1011 10:48:11.985846 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pnhrj" podStartSLOduration=2.124419363 podStartE2EDuration="3.985842021s" podCreationTimestamp="2025-10-11 10:48:08 +0000 UTC" firstStartedPulling="2025-10-11 10:48:09.92147442 +0000 UTC m=+566.475934722" lastFinishedPulling="2025-10-11 10:48:11.782897078 +0000 UTC m=+568.337357380" observedRunningTime="2025-10-11 10:48:11.984482465 +0000 UTC m=+568.538942767" watchObservedRunningTime="2025-10-11 10:48:11.985842021 +0000 UTC m=+568.540302313"
Oct 11 10:48:14.892700 master-0 kubenswrapper[4790]: I1011 10:48:14.892616 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp"]
Oct 11 10:48:14.893566 master-0 kubenswrapper[4790]: I1011 10:48:14.893534 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" Oct 11 10:48:14.896457 master-0 kubenswrapper[4790]: I1011 10:48:14.896406 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 11 10:48:14.897062 master-0 kubenswrapper[4790]: I1011 10:48:14.897027 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 11 10:48:14.904644 master-0 kubenswrapper[4790]: I1011 10:48:14.904581 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmnvz\" (UniqueName: \"kubernetes.io/projected/fd5cd971-0d18-4313-9102-4b59431a75ab-kube-api-access-rmnvz\") pod \"obo-prometheus-operator-7c8cf85677-8bmlp\" (UID: \"fd5cd971-0d18-4313-9102-4b59431a75ab\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" Oct 11 10:48:14.908401 master-0 kubenswrapper[4790]: I1011 10:48:14.908328 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp"] Oct 11 10:48:15.006853 master-0 kubenswrapper[4790]: I1011 10:48:15.006277 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmnvz\" (UniqueName: \"kubernetes.io/projected/fd5cd971-0d18-4313-9102-4b59431a75ab-kube-api-access-rmnvz\") pod \"obo-prometheus-operator-7c8cf85677-8bmlp\" (UID: \"fd5cd971-0d18-4313-9102-4b59431a75ab\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" Oct 11 10:48:15.016568 master-0 kubenswrapper[4790]: I1011 10:48:15.016464 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-9nqxf"] Oct 11 10:48:15.017461 master-0 kubenswrapper[4790]: I1011 10:48:15.017429 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.037812 master-0 kubenswrapper[4790]: I1011 10:48:15.032004 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-9nqxf"] Oct 11 10:48:15.043748 master-0 kubenswrapper[4790]: I1011 10:48:15.043675 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmnvz\" (UniqueName: \"kubernetes.io/projected/fd5cd971-0d18-4313-9102-4b59431a75ab-kube-api-access-rmnvz\") pod \"obo-prometheus-operator-7c8cf85677-8bmlp\" (UID: \"fd5cd971-0d18-4313-9102-4b59431a75ab\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" Oct 11 10:48:15.058755 master-0 kubenswrapper[4790]: I1011 10:48:15.055901 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92"] Oct 11 10:48:15.058755 master-0 kubenswrapper[4790]: I1011 10:48:15.057907 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.062756 master-0 kubenswrapper[4790]: I1011 10:48:15.060861 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 11 10:48:15.062756 master-0 kubenswrapper[4790]: I1011 10:48:15.062602 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92"] Oct 11 10:48:15.107196 master-0 kubenswrapper[4790]: I1011 10:48:15.107160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a65018a-6409-43ce-abe4-498a3ea576d4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-9nqxf\" (UID: \"0a65018a-6409-43ce-abe4-498a3ea576d4\") " pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.107334 master-0 kubenswrapper[4790]: I1011 10:48:15.107210 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01cf17dd-ef3f-47aa-8779-a099fc6d45a1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92\" (UID: \"01cf17dd-ef3f-47aa-8779-a099fc6d45a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.107334 master-0 kubenswrapper[4790]: I1011 10:48:15.107263 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01cf17dd-ef3f-47aa-8779-a099fc6d45a1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92\" (UID: \"01cf17dd-ef3f-47aa-8779-a099fc6d45a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.107334 master-0 kubenswrapper[4790]: 
I1011 10:48:15.107307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2c8f\" (UniqueName: \"kubernetes.io/projected/0a65018a-6409-43ce-abe4-498a3ea576d4-kube-api-access-q2c8f\") pod \"cert-manager-7d4cc89fcb-9nqxf\" (UID: \"0a65018a-6409-43ce-abe4-498a3ea576d4\") " pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.209126 master-0 kubenswrapper[4790]: I1011 10:48:15.208946 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a65018a-6409-43ce-abe4-498a3ea576d4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-9nqxf\" (UID: \"0a65018a-6409-43ce-abe4-498a3ea576d4\") " pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.209126 master-0 kubenswrapper[4790]: I1011 10:48:15.209040 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01cf17dd-ef3f-47aa-8779-a099fc6d45a1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92\" (UID: \"01cf17dd-ef3f-47aa-8779-a099fc6d45a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.209126 master-0 kubenswrapper[4790]: I1011 10:48:15.209078 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01cf17dd-ef3f-47aa-8779-a099fc6d45a1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92\" (UID: \"01cf17dd-ef3f-47aa-8779-a099fc6d45a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.209126 master-0 kubenswrapper[4790]: I1011 10:48:15.209106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2c8f\" (UniqueName: 
\"kubernetes.io/projected/0a65018a-6409-43ce-abe4-498a3ea576d4-kube-api-access-q2c8f\") pod \"cert-manager-7d4cc89fcb-9nqxf\" (UID: \"0a65018a-6409-43ce-abe4-498a3ea576d4\") " pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.211765 master-0 kubenswrapper[4790]: I1011 10:48:15.211672 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" Oct 11 10:48:15.214016 master-0 kubenswrapper[4790]: I1011 10:48:15.213952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01cf17dd-ef3f-47aa-8779-a099fc6d45a1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92\" (UID: \"01cf17dd-ef3f-47aa-8779-a099fc6d45a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.214131 master-0 kubenswrapper[4790]: I1011 10:48:15.214027 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01cf17dd-ef3f-47aa-8779-a099fc6d45a1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92\" (UID: \"01cf17dd-ef3f-47aa-8779-a099fc6d45a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.219984 master-0 kubenswrapper[4790]: I1011 10:48:15.219809 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-4pfh4"] Oct 11 10:48:15.220575 master-0 kubenswrapper[4790]: I1011 10:48:15.220536 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.223772 master-0 kubenswrapper[4790]: I1011 10:48:15.223698 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 11 10:48:15.237167 master-0 kubenswrapper[4790]: I1011 10:48:15.237105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a65018a-6409-43ce-abe4-498a3ea576d4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-9nqxf\" (UID: \"0a65018a-6409-43ce-abe4-498a3ea576d4\") " pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.238830 master-0 kubenswrapper[4790]: I1011 10:48:15.238783 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-4pfh4"] Oct 11 10:48:15.241637 master-0 kubenswrapper[4790]: I1011 10:48:15.240812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2c8f\" (UniqueName: \"kubernetes.io/projected/0a65018a-6409-43ce-abe4-498a3ea576d4-kube-api-access-q2c8f\") pod \"cert-manager-7d4cc89fcb-9nqxf\" (UID: \"0a65018a-6409-43ce-abe4-498a3ea576d4\") " pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.321200 master-0 kubenswrapper[4790]: I1011 10:48:15.311054 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-4pfh4\" (UID: \"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba\") " pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.321200 master-0 kubenswrapper[4790]: I1011 10:48:15.311212 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6rts\" (UniqueName: 
\"kubernetes.io/projected/5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba-kube-api-access-t6rts\") pod \"observability-operator-cc5f78dfc-4pfh4\" (UID: \"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba\") " pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.387926 master-0 kubenswrapper[4790]: I1011 10:48:15.387844 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" Oct 11 10:48:15.397008 master-0 kubenswrapper[4790]: I1011 10:48:15.396935 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" Oct 11 10:48:15.412910 master-0 kubenswrapper[4790]: I1011 10:48:15.412850 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6rts\" (UniqueName: \"kubernetes.io/projected/5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba-kube-api-access-t6rts\") pod \"observability-operator-cc5f78dfc-4pfh4\" (UID: \"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba\") " pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.413020 master-0 kubenswrapper[4790]: I1011 10:48:15.412947 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-4pfh4\" (UID: \"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba\") " pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.423760 master-0 kubenswrapper[4790]: I1011 10:48:15.423652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-4pfh4\" (UID: \"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba\") " 
pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.443664 master-0 kubenswrapper[4790]: I1011 10:48:15.443593 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-l5f8k"] Oct 11 10:48:15.444881 master-0 kubenswrapper[4790]: I1011 10:48:15.444829 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:15.462500 master-0 kubenswrapper[4790]: I1011 10:48:15.461305 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6rts\" (UniqueName: \"kubernetes.io/projected/5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba-kube-api-access-t6rts\") pod \"observability-operator-cc5f78dfc-4pfh4\" (UID: \"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba\") " pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.463735 master-0 kubenswrapper[4790]: I1011 10:48:15.463515 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-l5f8k"] Oct 11 10:48:15.514929 master-0 kubenswrapper[4790]: I1011 10:48:15.514871 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f45c80-0e01-450b-9b74-00b327f44495-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-l5f8k\" (UID: \"d3f45c80-0e01-450b-9b74-00b327f44495\") " pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:15.515202 master-0 kubenswrapper[4790]: I1011 10:48:15.514950 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqqq\" (UniqueName: \"kubernetes.io/projected/d3f45c80-0e01-450b-9b74-00b327f44495-kube-api-access-qcqqq\") pod \"perses-operator-54bc95c9fb-l5f8k\" (UID: \"d3f45c80-0e01-450b-9b74-00b327f44495\") " pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 
10:48:15.596853 master-0 kubenswrapper[4790]: I1011 10:48:15.579613 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:15.621744 master-0 kubenswrapper[4790]: I1011 10:48:15.620142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqqq\" (UniqueName: \"kubernetes.io/projected/d3f45c80-0e01-450b-9b74-00b327f44495-kube-api-access-qcqqq\") pod \"perses-operator-54bc95c9fb-l5f8k\" (UID: \"d3f45c80-0e01-450b-9b74-00b327f44495\") " pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:15.621744 master-0 kubenswrapper[4790]: I1011 10:48:15.620260 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f45c80-0e01-450b-9b74-00b327f44495-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-l5f8k\" (UID: \"d3f45c80-0e01-450b-9b74-00b327f44495\") " pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:15.621744 master-0 kubenswrapper[4790]: I1011 10:48:15.621373 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f45c80-0e01-450b-9b74-00b327f44495-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-l5f8k\" (UID: \"d3f45c80-0e01-450b-9b74-00b327f44495\") " pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:15.661036 master-0 kubenswrapper[4790]: I1011 10:48:15.660402 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqqq\" (UniqueName: \"kubernetes.io/projected/d3f45c80-0e01-450b-9b74-00b327f44495-kube-api-access-qcqqq\") pod \"perses-operator-54bc95c9fb-l5f8k\" (UID: \"d3f45c80-0e01-450b-9b74-00b327f44495\") " pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:15.682966 master-0 kubenswrapper[4790]: I1011 10:48:15.682917 
4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp"] Oct 11 10:48:15.691175 master-0 kubenswrapper[4790]: W1011 10:48:15.691096 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5cd971_0d18_4313_9102_4b59431a75ab.slice/crio-44787949514ee5f01c4f1136a51355f7d15ab47cb0068ff543214028080864e7 WatchSource:0}: Error finding container 44787949514ee5f01c4f1136a51355f7d15ab47cb0068ff543214028080864e7: Status 404 returned error can't find the container with id 44787949514ee5f01c4f1136a51355f7d15ab47cb0068ff543214028080864e7 Oct 11 10:48:15.771250 master-0 kubenswrapper[4790]: I1011 10:48:15.771169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:15.914439 master-0 kubenswrapper[4790]: I1011 10:48:15.914379 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92"] Oct 11 10:48:15.932500 master-0 kubenswrapper[4790]: W1011 10:48:15.932434 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01cf17dd_ef3f_47aa_8779_a099fc6d45a1.slice/crio-ad4e8b63f47d74158b4a2a4ab9dff92c64f3eab6c4e6abe1f03ebe05260c0249 WatchSource:0}: Error finding container ad4e8b63f47d74158b4a2a4ab9dff92c64f3eab6c4e6abe1f03ebe05260c0249: Status 404 returned error can't find the container with id ad4e8b63f47d74158b4a2a4ab9dff92c64f3eab6c4e6abe1f03ebe05260c0249 Oct 11 10:48:15.969682 master-0 kubenswrapper[4790]: I1011 10:48:15.969628 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-9nqxf"] Oct 11 10:48:15.979257 master-0 kubenswrapper[4790]: W1011 10:48:15.979131 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a65018a_6409_43ce_abe4_498a3ea576d4.slice/crio-4fa17b0853985692154fb7e022d85361b89c0e601389d84543a7c88a44717d99 WatchSource:0}: Error finding container 4fa17b0853985692154fb7e022d85361b89c0e601389d84543a7c88a44717d99: Status 404 returned error can't find the container with id 4fa17b0853985692154fb7e022d85361b89c0e601389d84543a7c88a44717d99 Oct 11 10:48:15.993350 master-0 kubenswrapper[4790]: I1011 10:48:15.993287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" event={"ID":"0a65018a-6409-43ce-abe4-498a3ea576d4","Type":"ContainerStarted","Data":"4fa17b0853985692154fb7e022d85361b89c0e601389d84543a7c88a44717d99"} Oct 11 10:48:15.994291 master-0 kubenswrapper[4790]: I1011 10:48:15.994263 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" event={"ID":"01cf17dd-ef3f-47aa-8779-a099fc6d45a1","Type":"ContainerStarted","Data":"ad4e8b63f47d74158b4a2a4ab9dff92c64f3eab6c4e6abe1f03ebe05260c0249"} Oct 11 10:48:15.996535 master-0 kubenswrapper[4790]: I1011 10:48:15.996505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" event={"ID":"fd5cd971-0d18-4313-9102-4b59431a75ab","Type":"ContainerStarted","Data":"44787949514ee5f01c4f1136a51355f7d15ab47cb0068ff543214028080864e7"} Oct 11 10:48:16.045679 master-0 kubenswrapper[4790]: I1011 10:48:16.045626 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-4pfh4"] Oct 11 10:48:16.063865 master-0 kubenswrapper[4790]: W1011 10:48:16.063815 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9eb95a_71cf_4cf8_b4c3_4ed5f3ca1fba.slice/crio-c19be77ab7510ac82d99e8c75ec458c8bd6b168ae4068f3c735d102ca03221d5 WatchSource:0}: Error 
finding container c19be77ab7510ac82d99e8c75ec458c8bd6b168ae4068f3c735d102ca03221d5: Status 404 returned error can't find the container with id c19be77ab7510ac82d99e8c75ec458c8bd6b168ae4068f3c735d102ca03221d5 Oct 11 10:48:16.197211 master-0 kubenswrapper[4790]: I1011 10:48:16.197164 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-l5f8k"] Oct 11 10:48:16.835211 master-0 kubenswrapper[4790]: I1011 10:48:16.835096 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": context deadline exceeded" start-of-body= Oct 11 10:48:16.835562 master-0 kubenswrapper[4790]: I1011 10:48:16.835216 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": context deadline exceeded" Oct 11 10:48:17.004919 master-0 kubenswrapper[4790]: I1011 10:48:17.004847 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" event={"ID":"0a65018a-6409-43ce-abe4-498a3ea576d4","Type":"ContainerStarted","Data":"30fd3faaf46dd915bca4f2363c1729938ca49265cabbdd70259cfbd58b1e4c40"} Oct 11 10:48:17.006427 master-0 kubenswrapper[4790]: I1011 10:48:17.006372 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" event={"ID":"d3f45c80-0e01-450b-9b74-00b327f44495","Type":"ContainerStarted","Data":"871b9ad1e0e74662f9d2a8c4d16a588d83f5502ab2e4686816d5e6c5d4c33dcd"} Oct 11 10:48:17.007911 master-0 kubenswrapper[4790]: I1011 10:48:17.007842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" 
event={"ID":"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba","Type":"ContainerStarted","Data":"c19be77ab7510ac82d99e8c75ec458c8bd6b168ae4068f3c735d102ca03221d5"} Oct 11 10:48:17.032314 master-0 kubenswrapper[4790]: I1011 10:48:17.032224 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-9nqxf" podStartSLOduration=3.032199221 podStartE2EDuration="3.032199221s" podCreationTimestamp="2025-10-11 10:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:48:17.029014175 +0000 UTC m=+573.583474477" watchObservedRunningTime="2025-10-11 10:48:17.032199221 +0000 UTC m=+573.586659523" Oct 11 10:48:17.382011 master-0 kubenswrapper[4790]: I1011 10:48:17.381931 4790 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:17.382247 master-0 kubenswrapper[4790]: I1011 10:48:17.382030 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="14286286be88b59efc7cfc15eca1cc38" containerName="etcd" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:20.040972 master-0 kubenswrapper[4790]: I1011 10:48:20.040768 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" event={"ID":"01cf17dd-ef3f-47aa-8779-a099fc6d45a1","Type":"ContainerStarted","Data":"640f070bcaba1a206fe6799d3c0b2ae9f35cb5252496f3b9d8cd5711b4ab8424"} Oct 11 10:48:20.042277 master-0 kubenswrapper[4790]: I1011 10:48:20.042213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" event={"ID":"d3f45c80-0e01-450b-9b74-00b327f44495","Type":"ContainerStarted","Data":"224c13404fda1e0a268044002e71d817076002a03cf1071f77ab909d22278ddf"} Oct 11 10:48:20.042387 master-0 kubenswrapper[4790]: I1011 10:48:20.042368 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:20.043765 master-0 kubenswrapper[4790]: I1011 10:48:20.043692 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" event={"ID":"fd5cd971-0d18-4313-9102-4b59431a75ab","Type":"ContainerStarted","Data":"935d2b311bfb697ce85dffbccfe171c48240c15188be19c47235cbdd1267003a"} Oct 11 10:48:20.076777 master-0 kubenswrapper[4790]: I1011 10:48:20.076699 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8564d76cc6-kfp92" podStartSLOduration=1.539420115 podStartE2EDuration="5.076679207s" podCreationTimestamp="2025-10-11 10:48:15 +0000 UTC" firstStartedPulling="2025-10-11 10:48:15.935112636 +0000 UTC m=+572.489572938" lastFinishedPulling="2025-10-11 10:48:19.472371738 +0000 UTC m=+576.026832030" observedRunningTime="2025-10-11 10:48:20.074394305 +0000 UTC m=+576.628854617" watchObservedRunningTime="2025-10-11 10:48:20.076679207 +0000 UTC m=+576.631139499" Oct 11 10:48:20.102904 master-0 kubenswrapper[4790]: I1011 10:48:20.102822 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" podStartSLOduration=1.830063389 podStartE2EDuration="5.102804243s" podCreationTimestamp="2025-10-11 10:48:15 +0000 UTC" firstStartedPulling="2025-10-11 10:48:16.202410909 +0000 UTC m=+572.756871201" lastFinishedPulling="2025-10-11 10:48:19.475151773 +0000 UTC m=+576.029612055" observedRunningTime="2025-10-11 10:48:20.102061933 +0000 UTC 
m=+576.656522235" watchObservedRunningTime="2025-10-11 10:48:20.102804243 +0000 UTC m=+576.657264535" Oct 11 10:48:21.838815 master-0 kubenswrapper[4790]: I1011 10:48:21.836001 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:21.838815 master-0 kubenswrapper[4790]: I1011 10:48:21.836083 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:22.057577 master-0 kubenswrapper[4790]: I1011 10:48:22.057502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" event={"ID":"5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba","Type":"ContainerStarted","Data":"f5da2d26b334a7a67d54c3b734cf64aecff03f3b8841b245bba51d74136ddab1"} Oct 11 10:48:22.058033 master-0 kubenswrapper[4790]: I1011 10:48:22.057992 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:22.060201 master-0 kubenswrapper[4790]: I1011 10:48:22.060163 4790 patch_prober.go:28] interesting pod/observability-operator-cc5f78dfc-4pfh4 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.130.0.42:8081/healthz\": dial tcp 10.130.0.42:8081: connect: connection refused" start-of-body= Oct 11 10:48:22.060277 master-0 kubenswrapper[4790]: I1011 10:48:22.060226 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" podUID="5c9eb95a-71cf-4cf8-b4c3-4ed5f3ca1fba" 
containerName="operator" probeResult="failure" output="Get \"http://10.130.0.42:8081/healthz\": dial tcp 10.130.0.42:8081: connect: connection refused" Oct 11 10:48:22.098118 master-0 kubenswrapper[4790]: I1011 10:48:22.098042 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8bmlp" podStartSLOduration=4.3068726139999995 podStartE2EDuration="8.098019846s" podCreationTimestamp="2025-10-11 10:48:14 +0000 UTC" firstStartedPulling="2025-10-11 10:48:15.693749694 +0000 UTC m=+572.248209976" lastFinishedPulling="2025-10-11 10:48:19.484896916 +0000 UTC m=+576.039357208" observedRunningTime="2025-10-11 10:48:20.142034133 +0000 UTC m=+576.696494425" watchObservedRunningTime="2025-10-11 10:48:22.098019846 +0000 UTC m=+578.652480138" Oct 11 10:48:22.100882 master-0 kubenswrapper[4790]: I1011 10:48:22.100823 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" podStartSLOduration=1.412032932 podStartE2EDuration="7.100809761s" podCreationTimestamp="2025-10-11 10:48:15 +0000 UTC" firstStartedPulling="2025-10-11 10:48:16.067443071 +0000 UTC m=+572.621903363" lastFinishedPulling="2025-10-11 10:48:21.7562199 +0000 UTC m=+578.310680192" observedRunningTime="2025-10-11 10:48:22.09631243 +0000 UTC m=+578.650772732" watchObservedRunningTime="2025-10-11 10:48:22.100809761 +0000 UTC m=+578.655270053" Oct 11 10:48:23.121817 master-0 kubenswrapper[4790]: I1011 10:48:23.121768 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-4pfh4" Oct 11 10:48:24.740429 master-0 kubenswrapper[4790]: I1011 10:48:24.740350 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-84d69c968c-btbcm" Oct 11 10:48:25.775819 master-0 kubenswrapper[4790]: I1011 10:48:25.775764 4790 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-l5f8k" Oct 11 10:48:26.836876 master-0 kubenswrapper[4790]: I1011 10:48:26.836603 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": context deadline exceeded" start-of-body= Oct 11 10:48:26.837464 master-0 kubenswrapper[4790]: I1011 10:48:26.836900 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": context deadline exceeded" Oct 11 10:48:27.381473 master-0 kubenswrapper[4790]: I1011 10:48:27.381341 4790 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:27.381961 master-0 kubenswrapper[4790]: I1011 10:48:27.381492 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="14286286be88b59efc7cfc15eca1cc38" containerName="etcd" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:31.838061 master-0 kubenswrapper[4790]: I1011 10:48:31.837971 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:31.838966 master-0 kubenswrapper[4790]: I1011 10:48:31.838081 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" 
podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:36.839051 master-0 kubenswrapper[4790]: I1011 10:48:36.838923 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:36.839051 master-0 kubenswrapper[4790]: I1011 10:48:36.839049 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:37.382308 master-0 kubenswrapper[4790]: I1011 10:48:37.382169 4790 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:37.382308 master-0 kubenswrapper[4790]: I1011 10:48:37.382292 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="14286286be88b59efc7cfc15eca1cc38" containerName="etcd" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:41.839416 master-0 kubenswrapper[4790]: I1011 10:48:41.839308 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Oct 11 10:48:41.840461 master-0 kubenswrapper[4790]: I1011 10:48:41.839448 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:44.044607 master-0 kubenswrapper[4790]: I1011 10:48:44.044488 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-56b566d9f-hppvq" Oct 11 10:48:46.840077 master-0 kubenswrapper[4790]: I1011 10:48:46.839927 4790 patch_prober.go:28] interesting pod/etcd-guard-master-0 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:46.840077 master-0 kubenswrapper[4790]: I1011 10:48:46.840047 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-0" podUID="c6436766-e7b0-471b-acbf-861280191521" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:47.381673 master-0 kubenswrapper[4790]: I1011 10:48:47.381553 4790 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 10:48:47.381673 master-0 kubenswrapper[4790]: I1011 10:48:47.381661 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="14286286be88b59efc7cfc15eca1cc38" containerName="etcd" probeResult="failure" output="Get 
\"https://192.168.34.10:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:48:47.922601 master-0 kubenswrapper[4790]: I1011 10:48:47.922529 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-guard-master-0" Oct 11 10:48:53.291754 master-0 kubenswrapper[4790]: I1011 10:48:53.291646 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w"] Oct 11 10:48:53.293180 master-0 kubenswrapper[4790]: I1011 10:48:53.293137 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:53.297635 master-0 kubenswrapper[4790]: I1011 10:48:53.297582 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 11 10:48:53.394020 master-0 kubenswrapper[4790]: I1011 10:48:53.393909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7359c204-2acb-4c3b-b05f-2a124f3862fb-cert\") pod \"frr-k8s-webhook-server-64bf5d555-54x4w\" (UID: \"7359c204-2acb-4c3b-b05f-2a124f3862fb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:53.394466 master-0 kubenswrapper[4790]: I1011 10:48:53.394141 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6rtf\" (UniqueName: \"kubernetes.io/projected/7359c204-2acb-4c3b-b05f-2a124f3862fb-kube-api-access-n6rtf\") pod \"frr-k8s-webhook-server-64bf5d555-54x4w\" (UID: \"7359c204-2acb-4c3b-b05f-2a124f3862fb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:53.495981 master-0 kubenswrapper[4790]: I1011 10:48:53.495896 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6rtf\" (UniqueName: 
\"kubernetes.io/projected/7359c204-2acb-4c3b-b05f-2a124f3862fb-kube-api-access-n6rtf\") pod \"frr-k8s-webhook-server-64bf5d555-54x4w\" (UID: \"7359c204-2acb-4c3b-b05f-2a124f3862fb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:53.496286 master-0 kubenswrapper[4790]: I1011 10:48:53.496030 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7359c204-2acb-4c3b-b05f-2a124f3862fb-cert\") pod \"frr-k8s-webhook-server-64bf5d555-54x4w\" (UID: \"7359c204-2acb-4c3b-b05f-2a124f3862fb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:53.502121 master-0 kubenswrapper[4790]: I1011 10:48:53.502055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7359c204-2acb-4c3b-b05f-2a124f3862fb-cert\") pod \"frr-k8s-webhook-server-64bf5d555-54x4w\" (UID: \"7359c204-2acb-4c3b-b05f-2a124f3862fb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:53.901288 master-0 kubenswrapper[4790]: I1011 10:48:53.901096 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w"] Oct 11 10:48:53.923627 master-0 kubenswrapper[4790]: I1011 10:48:53.923564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6rtf\" (UniqueName: \"kubernetes.io/projected/7359c204-2acb-4c3b-b05f-2a124f3862fb-kube-api-access-n6rtf\") pod \"frr-k8s-webhook-server-64bf5d555-54x4w\" (UID: \"7359c204-2acb-4c3b-b05f-2a124f3862fb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:54.051814 master-0 kubenswrapper[4790]: I1011 10:48:54.051755 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5xkrb"] Oct 11 10:48:54.054504 master-0 kubenswrapper[4790]: I1011 10:48:54.054479 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.058694 master-0 kubenswrapper[4790]: I1011 10:48:54.058668 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 11 10:48:54.058909 master-0 kubenswrapper[4790]: I1011 10:48:54.058701 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 11 10:48:54.107150 master-0 kubenswrapper[4790]: I1011 10:48:54.107078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd096860-a678-4b71-a23d-70ecd6b79a0d-metrics-certs\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.107409 master-0 kubenswrapper[4790]: I1011 10:48:54.107156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-metrics\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.107409 master-0 kubenswrapper[4790]: I1011 10:48:54.107204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-startup\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.107409 master-0 kubenswrapper[4790]: I1011 10:48:54.107318 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcq4\" (UniqueName: \"kubernetes.io/projected/bd096860-a678-4b71-a23d-70ecd6b79a0d-kube-api-access-6mcq4\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 
10:48:54.107409 master-0 kubenswrapper[4790]: I1011 10:48:54.107362 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-conf\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.107589 master-0 kubenswrapper[4790]: I1011 10:48:54.107471 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-sockets\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.107589 master-0 kubenswrapper[4790]: I1011 10:48:54.107508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-reloader\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.209444 master-0 kubenswrapper[4790]: I1011 10:48:54.209254 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-sockets\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.209776 master-0 kubenswrapper[4790]: I1011 10:48:54.209758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-reloader\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.209876 master-0 kubenswrapper[4790]: I1011 10:48:54.209864 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd096860-a678-4b71-a23d-70ecd6b79a0d-metrics-certs\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.209967 master-0 kubenswrapper[4790]: I1011 10:48:54.209953 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-metrics\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.210042 master-0 kubenswrapper[4790]: I1011 10:48:54.210030 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-startup\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.210158 master-0 kubenswrapper[4790]: I1011 10:48:54.210146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-conf\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.210267 master-0 kubenswrapper[4790]: I1011 10:48:54.210249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcq4\" (UniqueName: \"kubernetes.io/projected/bd096860-a678-4b71-a23d-70ecd6b79a0d-kube-api-access-6mcq4\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.211186 master-0 kubenswrapper[4790]: I1011 10:48:54.210166 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-sockets\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.211186 master-0 kubenswrapper[4790]: I1011 10:48:54.210570 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-reloader\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.211186 master-0 kubenswrapper[4790]: I1011 10:48:54.210652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-conf\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.211186 master-0 kubenswrapper[4790]: I1011 10:48:54.210872 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bd096860-a678-4b71-a23d-70ecd6b79a0d-metrics\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.211390 master-0 kubenswrapper[4790]: I1011 10:48:54.211348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bd096860-a678-4b71-a23d-70ecd6b79a0d-frr-startup\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.212898 master-0 kubenswrapper[4790]: I1011 10:48:54.212852 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" Oct 11 10:48:54.214882 master-0 kubenswrapper[4790]: I1011 10:48:54.214858 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd096860-a678-4b71-a23d-70ecd6b79a0d-metrics-certs\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.513418 master-0 kubenswrapper[4790]: I1011 10:48:54.513259 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcq4\" (UniqueName: \"kubernetes.io/projected/bd096860-a678-4b71-a23d-70ecd6b79a0d-kube-api-access-6mcq4\") pod \"frr-k8s-5xkrb\" (UID: \"bd096860-a678-4b71-a23d-70ecd6b79a0d\") " pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.672172 master-0 kubenswrapper[4790]: I1011 10:48:54.672099 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5xkrb" Oct 11 10:48:54.782794 master-0 kubenswrapper[4790]: I1011 10:48:54.782476 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w"] Oct 11 10:48:54.788541 master-0 kubenswrapper[4790]: W1011 10:48:54.788484 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7359c204_2acb_4c3b_b05f_2a124f3862fb.slice/crio-3eb4384d713a0d19539c60cd01fcdb0f30ce181686e9cc4b369634045bcc2e0c WatchSource:0}: Error finding container 3eb4384d713a0d19539c60cd01fcdb0f30ce181686e9cc4b369634045bcc2e0c: Status 404 returned error can't find the container with id 3eb4384d713a0d19539c60cd01fcdb0f30ce181686e9cc4b369634045bcc2e0c Oct 11 10:48:54.881227 master-0 kubenswrapper[4790]: I1011 10:48:54.881171 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8n7ld"] Oct 11 10:48:54.882632 master-0 kubenswrapper[4790]: I1011 10:48:54.882612 
4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8n7ld" Oct 11 10:48:54.885597 master-0 kubenswrapper[4790]: I1011 10:48:54.885539 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 11 10:48:54.886475 master-0 kubenswrapper[4790]: I1011 10:48:54.886421 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 11 10:48:54.886536 master-0 kubenswrapper[4790]: I1011 10:48:54.886487 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 11 10:48:54.948951 master-0 kubenswrapper[4790]: I1011 10:48:54.948857 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-rtr4h"] Oct 11 10:48:54.953996 master-0 kubenswrapper[4790]: I1011 10:48:54.953930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:54.958078 master-0 kubenswrapper[4790]: I1011 10:48:54.958025 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 11 10:48:54.962687 master-0 kubenswrapper[4790]: I1011 10:48:54.962612 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rtr4h"] Oct 11 10:48:55.035893 master-0 kubenswrapper[4790]: I1011 10:48:55.035754 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bd4ff7d-5743-4ecb-86e8-72a738214533-metrics-certs\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.035893 master-0 kubenswrapper[4790]: I1011 10:48:55.035876 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/0bd4ff7d-5743-4ecb-86e8-72a738214533-cert\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.036123 master-0 kubenswrapper[4790]: I1011 10:48:55.035917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hk4r\" (UniqueName: \"kubernetes.io/projected/0bd4ff7d-5743-4ecb-86e8-72a738214533-kube-api-access-8hk4r\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.036123 master-0 kubenswrapper[4790]: I1011 10:48:55.035943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-metrics-certs\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.036123 master-0 kubenswrapper[4790]: I1011 10:48:55.035977 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm682\" (UniqueName: \"kubernetes.io/projected/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-kube-api-access-jm682\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.036123 master-0 kubenswrapper[4790]: I1011 10:48:55.036024 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.036123 master-0 kubenswrapper[4790]: I1011 10:48:55.036047 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-metallb-excludel2\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.137388 master-0 kubenswrapper[4790]: I1011 10:48:55.137301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hk4r\" (UniqueName: \"kubernetes.io/projected/0bd4ff7d-5743-4ecb-86e8-72a738214533-kube-api-access-8hk4r\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.137388 master-0 kubenswrapper[4790]: I1011 10:48:55.137380 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-metrics-certs\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.137671 master-0 kubenswrapper[4790]: I1011 10:48:55.137422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm682\" (UniqueName: \"kubernetes.io/projected/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-kube-api-access-jm682\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.137671 master-0 kubenswrapper[4790]: I1011 10:48:55.137474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.137671 master-0 kubenswrapper[4790]: I1011 10:48:55.137502 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-metallb-excludel2\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.137671 master-0 kubenswrapper[4790]: I1011 10:48:55.137532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bd4ff7d-5743-4ecb-86e8-72a738214533-metrics-certs\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.137671 master-0 kubenswrapper[4790]: I1011 10:48:55.137593 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bd4ff7d-5743-4ecb-86e8-72a738214533-cert\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.137960 master-0 kubenswrapper[4790]: E1011 10:48:55.137879 4790 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 11 10:48:55.138105 master-0 kubenswrapper[4790]: E1011 10:48:55.138071 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist podName:7d3e23ec-dfa6-46d4-bf57-4e89ee459be5 nodeName:}" failed. No retries permitted until 2025-10-11 10:48:55.638028715 +0000 UTC m=+612.192489167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist") pod "speaker-8n7ld" (UID: "7d3e23ec-dfa6-46d4-bf57-4e89ee459be5") : secret "metallb-memberlist" not found Oct 11 10:48:55.138939 master-0 kubenswrapper[4790]: I1011 10:48:55.138888 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-metallb-excludel2\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.140986 master-0 kubenswrapper[4790]: I1011 10:48:55.140579 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 11 10:48:55.140986 master-0 kubenswrapper[4790]: I1011 10:48:55.140890 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-metrics-certs\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.146783 master-0 kubenswrapper[4790]: I1011 10:48:55.142963 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bd4ff7d-5743-4ecb-86e8-72a738214533-metrics-certs\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.152088 master-0 kubenswrapper[4790]: I1011 10:48:55.152031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bd4ff7d-5743-4ecb-86e8-72a738214533-cert\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.165953 master-0 kubenswrapper[4790]: I1011 
10:48:55.165867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm682\" (UniqueName: \"kubernetes.io/projected/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-kube-api-access-jm682\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld" Oct 11 10:48:55.169887 master-0 kubenswrapper[4790]: I1011 10:48:55.169826 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hk4r\" (UniqueName: \"kubernetes.io/projected/0bd4ff7d-5743-4ecb-86e8-72a738214533-kube-api-access-8hk4r\") pod \"controller-68d546b9d8-rtr4h\" (UID: \"0bd4ff7d-5743-4ecb-86e8-72a738214533\") " pod="metallb-system/controller-68d546b9d8-rtr4h" Oct 11 10:48:55.284824 master-0 kubenswrapper[4790]: I1011 10:48:55.284730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerStarted","Data":"2178a4b0af6e24d7fa63bc35d0bc782018b4fb72c46afa0209a738419753769f"} Oct 11 10:48:55.286023 master-0 kubenswrapper[4790]: I1011 10:48:55.285895 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" event={"ID":"7359c204-2acb-4c3b-b05f-2a124f3862fb","Type":"ContainerStarted","Data":"3eb4384d713a0d19539c60cd01fcdb0f30ce181686e9cc4b369634045bcc2e0c"} Oct 11 10:48:55.298879 master-0 kubenswrapper[4790]: I1011 10:48:55.298803 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-rtr4h"
Oct 11 10:48:55.644672 master-0 kubenswrapper[4790]: I1011 10:48:55.644599 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld"
Oct 11 10:48:55.645599 master-0 kubenswrapper[4790]: E1011 10:48:55.644842 4790 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 11 10:48:55.645599 master-0 kubenswrapper[4790]: E1011 10:48:55.644931 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist podName:7d3e23ec-dfa6-46d4-bf57-4e89ee459be5 nodeName:}" failed. No retries permitted until 2025-10-11 10:48:56.644907087 +0000 UTC m=+613.199367389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist") pod "speaker-8n7ld" (UID: "7d3e23ec-dfa6-46d4-bf57-4e89ee459be5") : secret "metallb-memberlist" not found
Oct 11 10:48:55.725679 master-0 kubenswrapper[4790]: I1011 10:48:55.725546 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rtr4h"]
Oct 11 10:48:55.730159 master-0 kubenswrapper[4790]: W1011 10:48:55.730042 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd4ff7d_5743_4ecb_86e8_72a738214533.slice/crio-b12d1120521b5dbe02d9fe1e03ed7553109738c5fbfa1f62efbc1a38cf4585a6 WatchSource:0}: Error finding container b12d1120521b5dbe02d9fe1e03ed7553109738c5fbfa1f62efbc1a38cf4585a6: Status 404 returned error can't find the container with id b12d1120521b5dbe02d9fe1e03ed7553109738c5fbfa1f62efbc1a38cf4585a6
Oct 11 10:48:56.319233 master-0 kubenswrapper[4790]: I1011 10:48:56.319153 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rtr4h" event={"ID":"0bd4ff7d-5743-4ecb-86e8-72a738214533","Type":"ContainerStarted","Data":"67cf87ff9ce384175f6b582b655ebfb055948d2882a8f4b64d37bd68bfc474ef"}
Oct 11 10:48:56.319233 master-0 kubenswrapper[4790]: I1011 10:48:56.319221 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rtr4h" event={"ID":"0bd4ff7d-5743-4ecb-86e8-72a738214533","Type":"ContainerStarted","Data":"b12d1120521b5dbe02d9fe1e03ed7553109738c5fbfa1f62efbc1a38cf4585a6"}
Oct 11 10:48:56.391638 master-0 kubenswrapper[4790]: I1011 10:48:56.391587 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Oct 11 10:48:56.402261 master-0 kubenswrapper[4790]: I1011 10:48:56.402235 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Oct 11 10:48:56.661153 master-0 kubenswrapper[4790]: I1011 10:48:56.660936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld"
Oct 11 10:48:56.665583 master-0 kubenswrapper[4790]: I1011 10:48:56.665506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d3e23ec-dfa6-46d4-bf57-4e89ee459be5-memberlist\") pod \"speaker-8n7ld\" (UID: \"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5\") " pod="metallb-system/speaker-8n7ld"
Oct 11 10:48:56.709132 master-0 kubenswrapper[4790]: I1011 10:48:56.709010 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8n7ld"
Oct 11 10:48:56.866733 master-0 kubenswrapper[4790]: I1011 10:48:56.860546 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"]
Oct 11 10:48:56.866733 master-0 kubenswrapper[4790]: I1011 10:48:56.861332 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:56.866733 master-0 kubenswrapper[4790]: I1011 10:48:56.864608 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"]
Oct 11 10:48:56.866733 master-0 kubenswrapper[4790]: I1011 10:48:56.865527 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"
Oct 11 10:48:56.879744 master-0 kubenswrapper[4790]: I1011 10:48:56.878119 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7f4xb"]
Oct 11 10:48:56.879744 master-0 kubenswrapper[4790]: I1011 10:48:56.878849 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:56.879744 master-0 kubenswrapper[4790]: I1011 10:48:56.879124 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 11 10:48:56.885425 master-0 kubenswrapper[4790]: I1011 10:48:56.884926 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"]
Oct 11 10:48:56.890413 master-0 kubenswrapper[4790]: I1011 10:48:56.890358 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"]
Oct 11 10:48:56.972321 master-0 kubenswrapper[4790]: I1011 10:48:56.972215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-dbus-socket\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:56.972321 master-0 kubenswrapper[4790]: I1011 10:48:56.972268 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-ovs-socket\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:56.972321 master-0 kubenswrapper[4790]: I1011 10:48:56.972297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-nmstate-lock\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:56.972321 master-0 kubenswrapper[4790]: I1011 10:48:56.972317 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsrbt\" (UniqueName: \"kubernetes.io/projected/ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e-kube-api-access-nsrbt\") pod \"nmstate-metrics-fdff9cb8d-w4js8\" (UID: \"ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"
Oct 11 10:48:56.972671 master-0 kubenswrapper[4790]: I1011 10:48:56.972350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ba695300-f2da-45e9-a825-81d462fc2d37-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-nf8q6\" (UID: \"ba695300-f2da-45e9-a825-81d462fc2d37\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:56.972671 master-0 kubenswrapper[4790]: I1011 10:48:56.972382 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmwhg\" (UniqueName: \"kubernetes.io/projected/0510dc20-c216-4f9a-b547-246dfdfc7d6f-kube-api-access-zmwhg\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:56.972671 master-0 kubenswrapper[4790]: I1011 10:48:56.972404 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vrl\" (UniqueName: \"kubernetes.io/projected/ba695300-f2da-45e9-a825-81d462fc2d37-kube-api-access-n7vrl\") pod \"nmstate-webhook-6cdbc54649-nf8q6\" (UID: \"ba695300-f2da-45e9-a825-81d462fc2d37\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:57.073647 master-0 kubenswrapper[4790]: I1011 10:48:57.073517 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsrbt\" (UniqueName: \"kubernetes.io/projected/ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e-kube-api-access-nsrbt\") pod \"nmstate-metrics-fdff9cb8d-w4js8\" (UID: \"ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"
Oct 11 10:48:57.073647 master-0 kubenswrapper[4790]: I1011 10:48:57.073578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ba695300-f2da-45e9-a825-81d462fc2d37-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-nf8q6\" (UID: \"ba695300-f2da-45e9-a825-81d462fc2d37\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:57.073647 master-0 kubenswrapper[4790]: I1011 10:48:57.073627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmwhg\" (UniqueName: \"kubernetes.io/projected/0510dc20-c216-4f9a-b547-246dfdfc7d6f-kube-api-access-zmwhg\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.074011 master-0 kubenswrapper[4790]: I1011 10:48:57.073678 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vrl\" (UniqueName: \"kubernetes.io/projected/ba695300-f2da-45e9-a825-81d462fc2d37-kube-api-access-n7vrl\") pod \"nmstate-webhook-6cdbc54649-nf8q6\" (UID: \"ba695300-f2da-45e9-a825-81d462fc2d37\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:57.074011 master-0 kubenswrapper[4790]: I1011 10:48:57.073749 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-dbus-socket\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.074011 master-0 kubenswrapper[4790]: I1011 10:48:57.073771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-ovs-socket\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.074011 master-0 kubenswrapper[4790]: I1011 10:48:57.073802 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-nmstate-lock\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.074011 master-0 kubenswrapper[4790]: I1011 10:48:57.073875 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-nmstate-lock\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.076152 master-0 kubenswrapper[4790]: I1011 10:48:57.075087 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-dbus-socket\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.076152 master-0 kubenswrapper[4790]: I1011 10:48:57.075210 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0510dc20-c216-4f9a-b547-246dfdfc7d6f-ovs-socket\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.077942 master-0 kubenswrapper[4790]: I1011 10:48:57.077910 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ba695300-f2da-45e9-a825-81d462fc2d37-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-nf8q6\" (UID: \"ba695300-f2da-45e9-a825-81d462fc2d37\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:57.108809 master-0 kubenswrapper[4790]: I1011 10:48:57.108170 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vrl\" (UniqueName: \"kubernetes.io/projected/ba695300-f2da-45e9-a825-81d462fc2d37-kube-api-access-n7vrl\") pod \"nmstate-webhook-6cdbc54649-nf8q6\" (UID: \"ba695300-f2da-45e9-a825-81d462fc2d37\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:57.112995 master-0 kubenswrapper[4790]: I1011 10:48:57.111313 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsrbt\" (UniqueName: \"kubernetes.io/projected/ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e-kube-api-access-nsrbt\") pod \"nmstate-metrics-fdff9cb8d-w4js8\" (UID: \"ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"
Oct 11 10:48:57.117073 master-0 kubenswrapper[4790]: I1011 10:48:57.117024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmwhg\" (UniqueName: \"kubernetes.io/projected/0510dc20-c216-4f9a-b547-246dfdfc7d6f-kube-api-access-zmwhg\") pod \"nmstate-handler-7f4xb\" (UID: \"0510dc20-c216-4f9a-b547-246dfdfc7d6f\") " pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.295181 master-0 kubenswrapper[4790]: I1011 10:48:57.294593 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:48:57.309386 master-0 kubenswrapper[4790]: I1011 10:48:57.309236 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"
Oct 11 10:48:57.312056 master-0 kubenswrapper[4790]: I1011 10:48:57.311801 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8n7ld" event={"ID":"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5","Type":"ContainerStarted","Data":"69e2fe06492b039c70785483990a1baaa42493149addceeb07e44f30a579bb4d"}
Oct 11 10:48:57.312056 master-0 kubenswrapper[4790]: I1011 10:48:57.311865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8n7ld" event={"ID":"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5","Type":"ContainerStarted","Data":"5b1a1555bfc740741678499598d7c6f9d36dd1f4f09eb9059069f6fa6588fe82"}
Oct 11 10:48:57.315498 master-0 kubenswrapper[4790]: I1011 10:48:57.315441 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:48:57.318478 master-0 kubenswrapper[4790]: I1011 10:48:57.318209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rtr4h" event={"ID":"0bd4ff7d-5743-4ecb-86e8-72a738214533","Type":"ContainerStarted","Data":"f1afc80009e4967865d987eb40153b9ddb8d76b97150006a9b8af8641bef245b"}
Oct 11 10:48:57.318931 master-0 kubenswrapper[4790]: I1011 10:48:57.318903 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-rtr4h"
Oct 11 10:48:57.349360 master-0 kubenswrapper[4790]: I1011 10:48:57.349269 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-rtr4h" podStartSLOduration=2.108411691 podStartE2EDuration="3.349245366s" podCreationTimestamp="2025-10-11 10:48:54 +0000 UTC" firstStartedPulling="2025-10-11 10:48:55.87910688 +0000 UTC m=+612.433567172" lastFinishedPulling="2025-10-11 10:48:57.119940555 +0000 UTC m=+613.674400847" observedRunningTime="2025-10-11 10:48:57.345063014 +0000 UTC m=+613.899523316" watchObservedRunningTime="2025-10-11 10:48:57.349245366 +0000 UTC m=+613.903705658"
Oct 11 10:48:57.370781 master-0 kubenswrapper[4790]: W1011 10:48:57.370721 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0510dc20_c216_4f9a_b547_246dfdfc7d6f.slice/crio-0892d4e25e4dfe1e070f3c79c17a0dcfeed9d6bfe113b8d352b30ef5be152020 WatchSource:0}: Error finding container 0892d4e25e4dfe1e070f3c79c17a0dcfeed9d6bfe113b8d352b30ef5be152020: Status 404 returned error can't find the container with id 0892d4e25e4dfe1e070f3c79c17a0dcfeed9d6bfe113b8d352b30ef5be152020
Oct 11 10:48:57.561783 master-0 kubenswrapper[4790]: I1011 10:48:57.559282 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"]
Oct 11 10:48:57.568335 master-0 kubenswrapper[4790]: W1011 10:48:57.568295 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba695300_f2da_45e9_a825_81d462fc2d37.slice/crio-77bfba122366eee57220829ca169781ce3b68b5d7d9cd9cc3d0051c8ab99cfe5 WatchSource:0}: Error finding container 77bfba122366eee57220829ca169781ce3b68b5d7d9cd9cc3d0051c8ab99cfe5: Status 404 returned error can't find the container with id 77bfba122366eee57220829ca169781ce3b68b5d7d9cd9cc3d0051c8ab99cfe5
Oct 11 10:48:57.837743 master-0 kubenswrapper[4790]: I1011 10:48:57.837604 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8"]
Oct 11 10:48:57.846307 master-0 kubenswrapper[4790]: W1011 10:48:57.845968 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca51cef5_fa00_4ea1_b7e6_e6e70bce9a0e.slice/crio-d280714a35416cff60c27ac7194da8e5d0a3728082b6dd69ff516ea6ceb8d117 WatchSource:0}: Error finding container d280714a35416cff60c27ac7194da8e5d0a3728082b6dd69ff516ea6ceb8d117: Status 404 returned error can't find the container with id d280714a35416cff60c27ac7194da8e5d0a3728082b6dd69ff516ea6ceb8d117
Oct 11 10:48:58.326812 master-0 kubenswrapper[4790]: I1011 10:48:58.326742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6" event={"ID":"ba695300-f2da-45e9-a825-81d462fc2d37","Type":"ContainerStarted","Data":"77bfba122366eee57220829ca169781ce3b68b5d7d9cd9cc3d0051c8ab99cfe5"}
Oct 11 10:48:58.328749 master-0 kubenswrapper[4790]: I1011 10:48:58.328694 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8n7ld" event={"ID":"7d3e23ec-dfa6-46d4-bf57-4e89ee459be5","Type":"ContainerStarted","Data":"6adc00087a258c176e89fff2b6e276c7b4340b462630d541afc18f6a4490ad96"}
Oct 11 10:48:58.329260 master-0 kubenswrapper[4790]: I1011 10:48:58.328954 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8n7ld"
Oct 11 10:48:58.330044 master-0 kubenswrapper[4790]: I1011 10:48:58.329983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8" event={"ID":"ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e","Type":"ContainerStarted","Data":"d280714a35416cff60c27ac7194da8e5d0a3728082b6dd69ff516ea6ceb8d117"}
Oct 11 10:48:58.331184 master-0 kubenswrapper[4790]: I1011 10:48:58.331123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7f4xb" event={"ID":"0510dc20-c216-4f9a-b547-246dfdfc7d6f","Type":"ContainerStarted","Data":"0892d4e25e4dfe1e070f3c79c17a0dcfeed9d6bfe113b8d352b30ef5be152020"}
Oct 11 10:48:58.355056 master-0 kubenswrapper[4790]: I1011 10:48:58.354333 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8n7ld" podStartSLOduration=3.58132414 podStartE2EDuration="4.354308297s" podCreationTimestamp="2025-10-11 10:48:54 +0000 UTC" firstStartedPulling="2025-10-11 10:48:57.087134488 +0000 UTC m=+613.641594770" lastFinishedPulling="2025-10-11 10:48:57.860118635 +0000 UTC m=+614.414578927" observedRunningTime="2025-10-11 10:48:58.354062051 +0000 UTC m=+614.908522363" watchObservedRunningTime="2025-10-11 10:48:58.354308297 +0000 UTC m=+614.908768599"
Oct 11 10:49:02.373058 master-0 kubenswrapper[4790]: I1011 10:49:02.372942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7f4xb" event={"ID":"0510dc20-c216-4f9a-b547-246dfdfc7d6f","Type":"ContainerStarted","Data":"896003a5d761753682474a3ff3764bf095553cd9ed7c77c90d2ab392d86fb6ae"}
Oct 11 10:49:02.373727 master-0 kubenswrapper[4790]: I1011 10:49:02.373685 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:49:02.375417 master-0 kubenswrapper[4790]: I1011 10:49:02.375387 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6" event={"ID":"ba695300-f2da-45e9-a825-81d462fc2d37","Type":"ContainerStarted","Data":"bef3fa469f056a37de1590df8388f77cc05599e20984e7f520dca92bad92c41b"}
Oct 11 10:49:02.375814 master-0 kubenswrapper[4790]: I1011 10:49:02.375797 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:49:02.377635 master-0 kubenswrapper[4790]: I1011 10:49:02.377606 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd096860-a678-4b71-a23d-70ecd6b79a0d" containerID="d2b731d75bb78feada279b6f954eb95a844aede877e73198f6f2986f8451c9d3" exitCode=0
Oct 11 10:49:02.377690 master-0 kubenswrapper[4790]: I1011 10:49:02.377651 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerDied","Data":"d2b731d75bb78feada279b6f954eb95a844aede877e73198f6f2986f8451c9d3"}
Oct 11 10:49:02.380548 master-0 kubenswrapper[4790]: I1011 10:49:02.380524 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8" event={"ID":"ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e","Type":"ContainerStarted","Data":"448d09c3a57c8ae58711dcc6850bf0448a935fc4e2d58ddf78b2acc603abae13"}
Oct 11 10:49:02.380593 master-0 kubenswrapper[4790]: I1011 10:49:02.380547 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8" event={"ID":"ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e","Type":"ContainerStarted","Data":"40f7437a6ab48e4ce69e619a5055f1df33b4838e11dd33faf6a0a462201c215b"}
Oct 11 10:49:02.382285 master-0 kubenswrapper[4790]: I1011 10:49:02.382253 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" event={"ID":"7359c204-2acb-4c3b-b05f-2a124f3862fb","Type":"ContainerStarted","Data":"1cd4b9e757b66f6f559342df739d18f0bdbf9b68d541fd01f6bd9b58b2273cfe"}
Oct 11 10:49:02.382614 master-0 kubenswrapper[4790]: I1011 10:49:02.382592 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w"
Oct 11 10:49:02.430875 master-0 kubenswrapper[4790]: I1011 10:49:02.430763 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7f4xb" podStartSLOduration=2.284630475 podStartE2EDuration="6.430732828s" podCreationTimestamp="2025-10-11 10:48:56 +0000 UTC" firstStartedPulling="2025-10-11 10:48:57.372897338 +0000 UTC m=+613.927357630" lastFinishedPulling="2025-10-11 10:49:01.518999691 +0000 UTC m=+618.073459983" observedRunningTime="2025-10-11 10:49:02.400539881 +0000 UTC m=+618.955000173" watchObservedRunningTime="2025-10-11 10:49:02.430732828 +0000 UTC m=+618.985193130"
Oct 11 10:49:02.458264 master-0 kubenswrapper[4790]: I1011 10:49:02.458184 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w" podStartSLOduration=3.7915816490000003 podStartE2EDuration="10.458160481s" podCreationTimestamp="2025-10-11 10:48:52 +0000 UTC" firstStartedPulling="2025-10-11 10:48:54.793917975 +0000 UTC m=+611.348378287" lastFinishedPulling="2025-10-11 10:49:01.460496827 +0000 UTC m=+618.014957119" observedRunningTime="2025-10-11 10:49:02.456539808 +0000 UTC m=+619.011000110" watchObservedRunningTime="2025-10-11 10:49:02.458160481 +0000 UTC m=+619.012620783"
Oct 11 10:49:02.506374 master-0 kubenswrapper[4790]: I1011 10:49:02.505391 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-w4js8" podStartSLOduration=2.909938094 podStartE2EDuration="6.505369443s" podCreationTimestamp="2025-10-11 10:48:56 +0000 UTC" firstStartedPulling="2025-10-11 10:48:57.85354988 +0000 UTC m=+614.408010172" lastFinishedPulling="2025-10-11 10:49:01.448981219 +0000 UTC m=+618.003441521" observedRunningTime="2025-10-11 10:49:02.48129566 +0000 UTC m=+619.035755952" watchObservedRunningTime="2025-10-11 10:49:02.505369443 +0000 UTC m=+619.059829735"
Oct 11 10:49:03.392443 master-0 kubenswrapper[4790]: I1011 10:49:03.392366 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd096860-a678-4b71-a23d-70ecd6b79a0d" containerID="a6e58df76386bb8e8ef07b134f1ffd543e0d2acceab2a91217be9721eba5984b" exitCode=0
Oct 11 10:49:03.395750 master-0 kubenswrapper[4790]: I1011 10:49:03.395257 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerDied","Data":"a6e58df76386bb8e8ef07b134f1ffd543e0d2acceab2a91217be9721eba5984b"}
Oct 11 10:49:03.444845 master-0 kubenswrapper[4790]: I1011 10:49:03.444675 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6" podStartSLOduration=3.554812534 podStartE2EDuration="7.444646626s" podCreationTimestamp="2025-10-11 10:48:56 +0000 UTC" firstStartedPulling="2025-10-11 10:48:57.575404992 +0000 UTC m=+614.129865284" lastFinishedPulling="2025-10-11 10:49:01.465239074 +0000 UTC m=+618.019699376" observedRunningTime="2025-10-11 10:49:02.506924185 +0000 UTC m=+619.061384497" watchObservedRunningTime="2025-10-11 10:49:03.444646626 +0000 UTC m=+619.999106958"
Oct 11 10:49:04.404125 master-0 kubenswrapper[4790]: I1011 10:49:04.404051 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd096860-a678-4b71-a23d-70ecd6b79a0d" containerID="01c897c87d8f8a7cf6baba2200820b92059430adc1563df5a24ec06d698ab7a9" exitCode=0
Oct 11 10:49:04.404942 master-0 kubenswrapper[4790]: I1011 10:49:04.404268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerDied","Data":"01c897c87d8f8a7cf6baba2200820b92059430adc1563df5a24ec06d698ab7a9"}
Oct 11 10:49:05.304111 master-0 kubenswrapper[4790]: I1011 10:49:05.304027 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-rtr4h"
Oct 11 10:49:05.415207 master-0 kubenswrapper[4790]: I1011 10:49:05.415121 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerStarted","Data":"363cfa304dabd2abd3b423133ce71d5412423784d251e2d462483f789125e224"}
Oct 11 10:49:05.415207 master-0 kubenswrapper[4790]: I1011 10:49:05.415180 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerStarted","Data":"846d6dea051489b20a5fdf0e50be9061753ef7a09012c2a6880726df214d6341"}
Oct 11 10:49:05.415207 master-0 kubenswrapper[4790]: I1011 10:49:05.415195 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerStarted","Data":"3814e1169066eb5f133a731006069b1dc8aabf8a2041ded1e0e1ac3eea968aa0"}
Oct 11 10:49:05.415207 master-0 kubenswrapper[4790]: I1011 10:49:05.415207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerStarted","Data":"437fd564f674ad40cbfea75214746496978529e1543cfb68813f78d6407c586d"}
Oct 11 10:49:05.415207 master-0 kubenswrapper[4790]: I1011 10:49:05.415218 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerStarted","Data":"6e6bbd5fcab64812b3669f62cb3e668299cc24f61ae6fb0eabcfa62ac935a3b6"}
Oct 11 10:49:05.415207 master-0 kubenswrapper[4790]: I1011 10:49:05.415228 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xkrb" event={"ID":"bd096860-a678-4b71-a23d-70ecd6b79a0d","Type":"ContainerStarted","Data":"5caed0c6041a80b902e0ba6a8c947468ac24bb32564bcf70b083f76af88ae8d4"}
Oct 11 10:49:05.416203 master-0 kubenswrapper[4790]: I1011 10:49:05.416035 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5xkrb"
Oct 11 10:49:05.942751 master-0 kubenswrapper[4790]: I1011 10:49:05.942540 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5xkrb" podStartSLOduration=6.308042138 podStartE2EDuration="12.942504571s" podCreationTimestamp="2025-10-11 10:48:53 +0000 UTC" firstStartedPulling="2025-10-11 10:48:54.828230333 +0000 UTC m=+611.382690625" lastFinishedPulling="2025-10-11 10:49:01.462692766 +0000 UTC m=+618.017153058" observedRunningTime="2025-10-11 10:49:05.937275861 +0000 UTC m=+622.491736213" watchObservedRunningTime="2025-10-11 10:49:05.942504571 +0000 UTC m=+622.496964903"
Oct 11 10:49:07.340546 master-0 kubenswrapper[4790]: I1011 10:49:07.340439 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7f4xb"
Oct 11 10:49:08.069255 master-0 kubenswrapper[4790]: I1011 10:49:08.068550 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f9d445f57-w4nwq"]
Oct 11 10:49:09.673214 master-0 kubenswrapper[4790]: I1011 10:49:09.673124 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5xkrb"
Oct 11 10:49:09.712254 master-0 kubenswrapper[4790]: I1011 10:49:09.712184 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5xkrb"
Oct 11 10:49:14.226136 master-0 kubenswrapper[4790]: I1011 10:49:14.226055 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-54x4w"
Oct 11 10:49:14.675287 master-0 kubenswrapper[4790]: I1011 10:49:14.675195 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5xkrb"
Oct 11 10:49:16.713960 master-0 kubenswrapper[4790]: I1011 10:49:16.713870 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8n7ld"
Oct 11 10:49:17.301477 master-0 kubenswrapper[4790]: I1011 10:49:17.301394 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nf8q6"
Oct 11 10:49:26.275332 master-0 kubenswrapper[4790]: I1011 10:49:26.275235 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-rxlsk"]
Oct 11 10:49:26.277233 master-0 kubenswrapper[4790]: I1011 10:49:26.277174 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.281454 master-0 kubenswrapper[4790]: I1011 10:49:26.281404 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert"
Oct 11 10:49:26.302642 master-0 kubenswrapper[4790]: I1011 10:49:26.302283 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-registration-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.302642 master-0 kubenswrapper[4790]: I1011 10:49:26.302368 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-sys\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.302642 master-0 kubenswrapper[4790]: I1011 10:49:26.302409 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-node-plugin-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.302642 master-0 kubenswrapper[4790]: I1011 10:49:26.302440 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-device-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.302642 master-0 kubenswrapper[4790]: I1011 10:49:26.302469 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-metrics-cert\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.302642 master-0 kubenswrapper[4790]: I1011 10:49:26.302523 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-pod-volumes-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.302642 master-0 kubenswrapper[4790]: I1011 10:49:26.302645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-lvmd-config\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.303266 master-0 kubenswrapper[4790]: I1011 10:49:26.302686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-run-udev\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.303266 master-0 kubenswrapper[4790]: I1011 10:49:26.302757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-file-lock-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.303266 master-0 kubenswrapper[4790]: I1011 10:49:26.302835 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-csi-plugin-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.303266 master-0 kubenswrapper[4790]: I1011 10:49:26.302903 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdl74\" (UniqueName: \"kubernetes.io/projected/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-kube-api-access-pdl74\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.317230 master-0 kubenswrapper[4790]: I1011 10:49:26.317047 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-rxlsk"]
Oct 11 10:49:26.404522 master-0 kubenswrapper[4790]: I1011 10:49:26.404441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-lvmd-config\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.404522 master-0 kubenswrapper[4790]: I1011 10:49:26.404505 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-run-udev\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.404522 master-0 kubenswrapper[4790]: I1011 10:49:26.404526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-file-lock-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404556 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-csi-plugin-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404586 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdl74\" (UniqueName: \"kubernetes.io/projected/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-kube-api-access-pdl74\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk"
Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName:
\"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-registration-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404649 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-sys\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-node-plugin-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404690 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-device-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404906 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-metrics-cert\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404931 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-pod-volumes-dir\") pod \"vg-manager-rxlsk\" 
(UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404942 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-lvmd-config\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.404942 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-run-udev\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.405042 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-device-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.405308 master-0 kubenswrapper[4790]: I1011 10:49:26.405150 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-sys\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.406085 master-0 kubenswrapper[4790]: I1011 10:49:26.405324 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-file-lock-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.406085 master-0 kubenswrapper[4790]: I1011 
10:49:26.405473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-registration-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.406085 master-0 kubenswrapper[4790]: I1011 10:49:26.405523 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-csi-plugin-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.406085 master-0 kubenswrapper[4790]: I1011 10:49:26.405721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-pod-volumes-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.406085 master-0 kubenswrapper[4790]: I1011 10:49:26.405931 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-node-plugin-dir\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.408527 master-0 kubenswrapper[4790]: I1011 10:49:26.408487 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-metrics-cert\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.423790 master-0 kubenswrapper[4790]: I1011 10:49:26.423676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pdl74\" (UniqueName: \"kubernetes.io/projected/3d08adb1-c6cb-41b9-a68b-68a1e41b883a-kube-api-access-pdl74\") pod \"vg-manager-rxlsk\" (UID: \"3d08adb1-c6cb-41b9-a68b-68a1e41b883a\") " pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:26.609195 master-0 kubenswrapper[4790]: I1011 10:49:26.609102 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:27.099886 master-0 kubenswrapper[4790]: I1011 10:49:27.099792 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-rxlsk"] Oct 11 10:49:27.114780 master-0 kubenswrapper[4790]: W1011 10:49:27.114245 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d08adb1_c6cb_41b9_a68b_68a1e41b883a.slice/crio-385f9d8067ff024e7eb1474454370d2badf2807493d5a215237ca9ab23771762 WatchSource:0}: Error finding container 385f9d8067ff024e7eb1474454370d2badf2807493d5a215237ca9ab23771762: Status 404 returned error can't find the container with id 385f9d8067ff024e7eb1474454370d2badf2807493d5a215237ca9ab23771762 Oct 11 10:49:27.578115 master-0 kubenswrapper[4790]: I1011 10:49:27.578032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-rxlsk" event={"ID":"3d08adb1-c6cb-41b9-a68b-68a1e41b883a","Type":"ContainerStarted","Data":"a565c019c9824f5d08f5c05d055d268c1b884d168d1b37b88e96d612f7a7f10e"} Oct 11 10:49:27.578115 master-0 kubenswrapper[4790]: I1011 10:49:27.578104 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-rxlsk" event={"ID":"3d08adb1-c6cb-41b9-a68b-68a1e41b883a","Type":"ContainerStarted","Data":"385f9d8067ff024e7eb1474454370d2badf2807493d5a215237ca9ab23771762"} Oct 11 10:49:27.615789 master-0 kubenswrapper[4790]: I1011 10:49:27.615649 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-rxlsk" 
podStartSLOduration=1.615620072 podStartE2EDuration="1.615620072s" podCreationTimestamp="2025-10-11 10:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:49:27.611150082 +0000 UTC m=+644.165610454" watchObservedRunningTime="2025-10-11 10:49:27.615620072 +0000 UTC m=+644.170080364" Oct 11 10:49:29.592820 master-0 kubenswrapper[4790]: I1011 10:49:29.592753 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-rxlsk_3d08adb1-c6cb-41b9-a68b-68a1e41b883a/vg-manager/0.log" Oct 11 10:49:29.592820 master-0 kubenswrapper[4790]: I1011 10:49:29.592813 4790 generic.go:334] "Generic (PLEG): container finished" podID="3d08adb1-c6cb-41b9-a68b-68a1e41b883a" containerID="a565c019c9824f5d08f5c05d055d268c1b884d168d1b37b88e96d612f7a7f10e" exitCode=1 Oct 11 10:49:29.593749 master-0 kubenswrapper[4790]: I1011 10:49:29.592852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-rxlsk" event={"ID":"3d08adb1-c6cb-41b9-a68b-68a1e41b883a","Type":"ContainerDied","Data":"a565c019c9824f5d08f5c05d055d268c1b884d168d1b37b88e96d612f7a7f10e"} Oct 11 10:49:29.593749 master-0 kubenswrapper[4790]: I1011 10:49:29.593391 4790 scope.go:117] "RemoveContainer" containerID="a565c019c9824f5d08f5c05d055d268c1b884d168d1b37b88e96d612f7a7f10e" Oct 11 10:49:29.916234 master-0 kubenswrapper[4790]: I1011 10:49:29.916150 4790 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Oct 11 10:49:30.602918 master-0 kubenswrapper[4790]: I1011 10:49:30.602823 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-rxlsk_3d08adb1-c6cb-41b9-a68b-68a1e41b883a/vg-manager/0.log" Oct 11 10:49:30.602918 master-0 kubenswrapper[4790]: I1011 10:49:30.602917 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-storage/vg-manager-rxlsk" event={"ID":"3d08adb1-c6cb-41b9-a68b-68a1e41b883a","Type":"ContainerStarted","Data":"0769b027ed49aca1b276af8399b5e4fafd3aee07ccc9f93cd2ac4a7303e482df"} Oct 11 10:49:30.744922 master-0 kubenswrapper[4790]: I1011 10:49:30.743955 4790 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2025-10-11T10:49:29.916196642Z","Handler":null,"Name":""} Oct 11 10:49:30.747306 master-0 kubenswrapper[4790]: I1011 10:49:30.747246 4790 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Oct 11 10:49:30.747456 master-0 kubenswrapper[4790]: I1011 10:49:30.747320 4790 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Oct 11 10:49:33.107298 master-0 kubenswrapper[4790]: I1011 10:49:33.107217 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6f9d445f57-w4nwq" podUID="e299247b-558b-4b6c-9d7c-335475344fdc" containerName="console" containerID="cri-o://3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0" gracePeriod=15 Oct 11 10:49:33.525789 master-0 kubenswrapper[4790]: I1011 10:49:33.525738 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f9d445f57-w4nwq_e299247b-558b-4b6c-9d7c-335475344fdc/console/0.log" Oct 11 10:49:33.526145 master-0 kubenswrapper[4790]: I1011 10:49:33.525842 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:49:33.625132 master-0 kubenswrapper[4790]: I1011 10:49:33.625083 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f9d445f57-w4nwq_e299247b-558b-4b6c-9d7c-335475344fdc/console/0.log" Oct 11 10:49:33.625765 master-0 kubenswrapper[4790]: I1011 10:49:33.625165 4790 generic.go:334] "Generic (PLEG): container finished" podID="e299247b-558b-4b6c-9d7c-335475344fdc" containerID="3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0" exitCode=2 Oct 11 10:49:33.625765 master-0 kubenswrapper[4790]: I1011 10:49:33.625213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9d445f57-w4nwq" event={"ID":"e299247b-558b-4b6c-9d7c-335475344fdc","Type":"ContainerDied","Data":"3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0"} Oct 11 10:49:33.625765 master-0 kubenswrapper[4790]: I1011 10:49:33.625259 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9d445f57-w4nwq" event={"ID":"e299247b-558b-4b6c-9d7c-335475344fdc","Type":"ContainerDied","Data":"c55a75264f621b83bca51a6b3abc1339de4396e41f1c001671f54d97f3acff21"} Oct 11 10:49:33.625765 master-0 kubenswrapper[4790]: I1011 10:49:33.625290 4790 scope.go:117] "RemoveContainer" containerID="3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0" Oct 11 10:49:33.625765 master-0 kubenswrapper[4790]: I1011 10:49:33.625496 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f9d445f57-w4nwq" Oct 11 10:49:33.646512 master-0 kubenswrapper[4790]: I1011 10:49:33.646454 4790 scope.go:117] "RemoveContainer" containerID="3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0" Oct 11 10:49:33.647202 master-0 kubenswrapper[4790]: E1011 10:49:33.647143 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0\": container with ID starting with 3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0 not found: ID does not exist" containerID="3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0" Oct 11 10:49:33.647302 master-0 kubenswrapper[4790]: I1011 10:49:33.647211 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0"} err="failed to get container status \"3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0\": rpc error: code = NotFound desc = could not find container \"3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0\": container with ID starting with 3088103a931b6dbaca679e1253ed8ab435375376728343b7793fadb15e48f6e0 not found: ID does not exist" Oct 11 10:49:33.718437 master-0 kubenswrapper[4790]: I1011 10:49:33.718329 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-service-ca\") pod \"e299247b-558b-4b6c-9d7c-335475344fdc\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " Oct 11 10:49:33.718921 master-0 kubenswrapper[4790]: I1011 10:49:33.718594 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-oauth-serving-cert\") pod 
\"e299247b-558b-4b6c-9d7c-335475344fdc\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " Oct 11 10:49:33.718921 master-0 kubenswrapper[4790]: I1011 10:49:33.718665 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-serving-cert\") pod \"e299247b-558b-4b6c-9d7c-335475344fdc\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " Oct 11 10:49:33.718921 master-0 kubenswrapper[4790]: I1011 10:49:33.718747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-oauth-config\") pod \"e299247b-558b-4b6c-9d7c-335475344fdc\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " Oct 11 10:49:33.718921 master-0 kubenswrapper[4790]: I1011 10:49:33.718853 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-trusted-ca-bundle\") pod \"e299247b-558b-4b6c-9d7c-335475344fdc\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " Oct 11 10:49:33.718921 master-0 kubenswrapper[4790]: I1011 10:49:33.718922 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brv7r\" (UniqueName: \"kubernetes.io/projected/e299247b-558b-4b6c-9d7c-335475344fdc-kube-api-access-brv7r\") pod \"e299247b-558b-4b6c-9d7c-335475344fdc\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " Oct 11 10:49:33.719478 master-0 kubenswrapper[4790]: I1011 10:49:33.718989 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-console-config\") pod \"e299247b-558b-4b6c-9d7c-335475344fdc\" (UID: \"e299247b-558b-4b6c-9d7c-335475344fdc\") " Oct 11 10:49:33.719750 master-0 
kubenswrapper[4790]: I1011 10:49:33.719470 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-service-ca" (OuterVolumeSpecName: "service-ca") pod "e299247b-558b-4b6c-9d7c-335475344fdc" (UID: "e299247b-558b-4b6c-9d7c-335475344fdc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:49:33.720272 master-0 kubenswrapper[4790]: I1011 10:49:33.720240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e299247b-558b-4b6c-9d7c-335475344fdc" (UID: "e299247b-558b-4b6c-9d7c-335475344fdc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:49:33.721180 master-0 kubenswrapper[4790]: I1011 10:49:33.721132 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-console-config" (OuterVolumeSpecName: "console-config") pod "e299247b-558b-4b6c-9d7c-335475344fdc" (UID: "e299247b-558b-4b6c-9d7c-335475344fdc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:49:33.721344 master-0 kubenswrapper[4790]: I1011 10:49:33.721197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e299247b-558b-4b6c-9d7c-335475344fdc" (UID: "e299247b-558b-4b6c-9d7c-335475344fdc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:49:33.726124 master-0 kubenswrapper[4790]: I1011 10:49:33.726085 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e299247b-558b-4b6c-9d7c-335475344fdc" (UID: "e299247b-558b-4b6c-9d7c-335475344fdc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:49:33.727030 master-0 kubenswrapper[4790]: I1011 10:49:33.726938 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e299247b-558b-4b6c-9d7c-335475344fdc" (UID: "e299247b-558b-4b6c-9d7c-335475344fdc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:49:33.727172 master-0 kubenswrapper[4790]: I1011 10:49:33.727107 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e299247b-558b-4b6c-9d7c-335475344fdc-kube-api-access-brv7r" (OuterVolumeSpecName: "kube-api-access-brv7r") pod "e299247b-558b-4b6c-9d7c-335475344fdc" (UID: "e299247b-558b-4b6c-9d7c-335475344fdc"). InnerVolumeSpecName "kube-api-access-brv7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:49:33.822099 master-0 kubenswrapper[4790]: I1011 10:49:33.822007 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-service-ca\") on node \"master-0\" DevicePath \"\"" Oct 11 10:49:33.822099 master-0 kubenswrapper[4790]: I1011 10:49:33.822098 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Oct 11 10:49:33.822393 master-0 kubenswrapper[4790]: I1011 10:49:33.822126 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Oct 11 10:49:33.822393 master-0 kubenswrapper[4790]: I1011 10:49:33.822148 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e299247b-558b-4b6c-9d7c-335475344fdc-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:49:33.822393 master-0 kubenswrapper[4790]: I1011 10:49:33.822168 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:49:33.822393 master-0 kubenswrapper[4790]: I1011 10:49:33.822189 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brv7r\" (UniqueName: \"kubernetes.io/projected/e299247b-558b-4b6c-9d7c-335475344fdc-kube-api-access-brv7r\") on node \"master-0\" DevicePath \"\"" Oct 11 10:49:33.822393 master-0 kubenswrapper[4790]: I1011 10:49:33.822209 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e299247b-558b-4b6c-9d7c-335475344fdc-console-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:49:33.970958 master-0 kubenswrapper[4790]: I1011 10:49:33.970872 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f9d445f57-w4nwq"] Oct 11 10:49:33.978683 master-0 kubenswrapper[4790]: I1011 10:49:33.978567 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f9d445f57-w4nwq"] Oct 11 10:49:34.309950 master-0 kubenswrapper[4790]: I1011 10:49:34.308520 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e299247b-558b-4b6c-9d7c-335475344fdc" path="/var/lib/kubelet/pods/e299247b-558b-4b6c-9d7c-335475344fdc/volumes" Oct 11 10:49:36.609867 master-0 kubenswrapper[4790]: I1011 10:49:36.609776 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:36.612817 master-0 kubenswrapper[4790]: I1011 10:49:36.612768 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:36.647912 master-0 kubenswrapper[4790]: I1011 10:49:36.647823 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:36.649262 master-0 kubenswrapper[4790]: I1011 10:49:36.649218 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-rxlsk" Oct 11 10:49:42.202338 master-0 kubenswrapper[4790]: I1011 10:49:42.202259 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5qz5r"] Oct 11 10:49:42.203079 master-0 kubenswrapper[4790]: E1011 10:49:42.202518 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e299247b-558b-4b6c-9d7c-335475344fdc" containerName="console" Oct 11 10:49:42.203079 master-0 kubenswrapper[4790]: I1011 10:49:42.202536 4790 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e299247b-558b-4b6c-9d7c-335475344fdc" containerName="console" Oct 11 10:49:42.203079 master-0 kubenswrapper[4790]: I1011 10:49:42.202684 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e299247b-558b-4b6c-9d7c-335475344fdc" containerName="console" Oct 11 10:49:42.203245 master-0 kubenswrapper[4790]: I1011 10:49:42.203211 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:49:42.205656 master-0 kubenswrapper[4790]: I1011 10:49:42.205621 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 11 10:49:42.205912 master-0 kubenswrapper[4790]: I1011 10:49:42.205881 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 11 10:49:42.215814 master-0 kubenswrapper[4790]: I1011 10:49:42.215772 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5qz5r"] Oct 11 10:49:42.346045 master-0 kubenswrapper[4790]: I1011 10:49:42.345961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjsv8\" (UniqueName: \"kubernetes.io/projected/1e0eae2d-8fb0-4da2-b668-7e68e812682e-kube-api-access-xjsv8\") pod \"openstack-operator-index-5qz5r\" (UID: \"1e0eae2d-8fb0-4da2-b668-7e68e812682e\") " pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:49:42.447697 master-0 kubenswrapper[4790]: I1011 10:49:42.447599 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjsv8\" (UniqueName: \"kubernetes.io/projected/1e0eae2d-8fb0-4da2-b668-7e68e812682e-kube-api-access-xjsv8\") pod \"openstack-operator-index-5qz5r\" (UID: \"1e0eae2d-8fb0-4da2-b668-7e68e812682e\") " pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:49:42.471140 master-0 
kubenswrapper[4790]: I1011 10:49:42.470983 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjsv8\" (UniqueName: \"kubernetes.io/projected/1e0eae2d-8fb0-4da2-b668-7e68e812682e-kube-api-access-xjsv8\") pod \"openstack-operator-index-5qz5r\" (UID: \"1e0eae2d-8fb0-4da2-b668-7e68e812682e\") " pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:49:42.517555 master-0 kubenswrapper[4790]: I1011 10:49:42.517466 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:49:42.975048 master-0 kubenswrapper[4790]: I1011 10:49:42.974982 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5qz5r"] Oct 11 10:49:42.982002 master-0 kubenswrapper[4790]: W1011 10:49:42.981912 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e0eae2d_8fb0_4da2_b668_7e68e812682e.slice/crio-6c9151ee8df0131ac2502a5ebcb11017f3173ee51beb89b0ecfd331fd6eb9af5 WatchSource:0}: Error finding container 6c9151ee8df0131ac2502a5ebcb11017f3173ee51beb89b0ecfd331fd6eb9af5: Status 404 returned error can't find the container with id 6c9151ee8df0131ac2502a5ebcb11017f3173ee51beb89b0ecfd331fd6eb9af5 Oct 11 10:49:43.705325 master-0 kubenswrapper[4790]: I1011 10:49:43.705251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5qz5r" event={"ID":"1e0eae2d-8fb0-4da2-b668-7e68e812682e","Type":"ContainerStarted","Data":"6c9151ee8df0131ac2502a5ebcb11017f3173ee51beb89b0ecfd331fd6eb9af5"} Oct 11 10:49:51.765291 master-0 kubenswrapper[4790]: I1011 10:49:51.765210 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5qz5r" 
event={"ID":"1e0eae2d-8fb0-4da2-b668-7e68e812682e","Type":"ContainerStarted","Data":"0975185ed6e30305f667e10eec57bee48416fae36cef7e6d25229e9488efa83b"} Oct 11 10:49:51.795378 master-0 kubenswrapper[4790]: I1011 10:49:51.795201 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5qz5r" podStartSLOduration=2.099880572 podStartE2EDuration="9.795156702s" podCreationTimestamp="2025-10-11 10:49:42 +0000 UTC" firstStartedPulling="2025-10-11 10:49:42.983229129 +0000 UTC m=+659.537689421" lastFinishedPulling="2025-10-11 10:49:50.678505259 +0000 UTC m=+667.232965551" observedRunningTime="2025-10-11 10:49:51.793506017 +0000 UTC m=+668.347966309" watchObservedRunningTime="2025-10-11 10:49:51.795156702 +0000 UTC m=+668.349617034" Oct 11 10:49:52.518050 master-0 kubenswrapper[4790]: I1011 10:49:52.517840 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:49:52.518461 master-0 kubenswrapper[4790]: I1011 10:49:52.518141 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:49:52.549411 master-0 kubenswrapper[4790]: I1011 10:49:52.549332 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:50:02.550419 master-0 kubenswrapper[4790]: I1011 10:50:02.550340 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5qz5r" Oct 11 10:50:03.007995 master-0 kubenswrapper[4790]: I1011 10:50:03.007789 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-6-master-0"] Oct 11 10:50:03.008935 master-0 kubenswrapper[4790]: I1011 10:50:03.008901 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:03.012905 master-0 kubenswrapper[4790]: I1011 10:50:03.012819 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-t28rg" Oct 11 10:50:03.027661 master-0 kubenswrapper[4790]: I1011 10:50:03.026775 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-6-master-0"] Oct 11 10:50:03.094175 master-0 kubenswrapper[4790]: I1011 10:50:03.094099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a0545c-12d2-49a0-be5e-17f472bac134-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:03.094790 master-0 kubenswrapper[4790]: I1011 10:50:03.094746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a0545c-12d2-49a0-be5e-17f472bac134-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:03.196224 master-0 kubenswrapper[4790]: I1011 10:50:03.196092 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a0545c-12d2-49a0-be5e-17f472bac134-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:03.196583 master-0 kubenswrapper[4790]: I1011 10:50:03.196254 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/14a0545c-12d2-49a0-be5e-17f472bac134-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:03.196583 master-0 kubenswrapper[4790]: I1011 10:50:03.196384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a0545c-12d2-49a0-be5e-17f472bac134-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:03.219698 master-0 kubenswrapper[4790]: I1011 10:50:03.219609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a0545c-12d2-49a0-be5e-17f472bac134-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:03.336579 master-0 kubenswrapper[4790]: I1011 10:50:03.336449 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:04.003652 master-0 kubenswrapper[4790]: W1011 10:50:04.003559 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod14a0545c_12d2_49a0_be5e_17f472bac134.slice/crio-b8dcb1e6a79f758d4b56351011463269e2f84dae8484b90ceaaf4bc932ecb153 WatchSource:0}: Error finding container b8dcb1e6a79f758d4b56351011463269e2f84dae8484b90ceaaf4bc932ecb153: Status 404 returned error can't find the container with id b8dcb1e6a79f758d4b56351011463269e2f84dae8484b90ceaaf4bc932ecb153 Oct 11 10:50:04.039289 master-0 kubenswrapper[4790]: I1011 10:50:04.039050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-6-master-0"] Oct 11 10:50:04.862692 master-0 kubenswrapper[4790]: I1011 10:50:04.862512 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-0" event={"ID":"14a0545c-12d2-49a0-be5e-17f472bac134","Type":"ContainerStarted","Data":"1b18bd815a4c78ab3d993af09213efb805e050af67596981f577c078fd8793c8"} Oct 11 10:50:04.862692 master-0 kubenswrapper[4790]: I1011 10:50:04.862582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-0" event={"ID":"14a0545c-12d2-49a0-be5e-17f472bac134","Type":"ContainerStarted","Data":"b8dcb1e6a79f758d4b56351011463269e2f84dae8484b90ceaaf4bc932ecb153"} Oct 11 10:50:04.893365 master-0 kubenswrapper[4790]: I1011 10:50:04.893234 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-6-master-0" podStartSLOduration=2.893199858 podStartE2EDuration="2.893199858s" podCreationTimestamp="2025-10-11 10:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:50:04.89293016 +0000 UTC m=+681.447390462" 
watchObservedRunningTime="2025-10-11 10:50:04.893199858 +0000 UTC m=+681.447660190" Oct 11 10:50:05.872569 master-0 kubenswrapper[4790]: I1011 10:50:05.872454 4790 generic.go:334] "Generic (PLEG): container finished" podID="14a0545c-12d2-49a0-be5e-17f472bac134" containerID="1b18bd815a4c78ab3d993af09213efb805e050af67596981f577c078fd8793c8" exitCode=0 Oct 11 10:50:05.872569 master-0 kubenswrapper[4790]: I1011 10:50:05.872526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-0" event={"ID":"14a0545c-12d2-49a0-be5e-17f472bac134","Type":"ContainerDied","Data":"1b18bd815a4c78ab3d993af09213efb805e050af67596981f577c078fd8793c8"} Oct 11 10:50:07.301744 master-0 kubenswrapper[4790]: I1011 10:50:07.301665 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:07.458799 master-0 kubenswrapper[4790]: I1011 10:50:07.458682 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a0545c-12d2-49a0-be5e-17f472bac134-kubelet-dir\") pod \"14a0545c-12d2-49a0-be5e-17f472bac134\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " Oct 11 10:50:07.458799 master-0 kubenswrapper[4790]: I1011 10:50:07.458825 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a0545c-12d2-49a0-be5e-17f472bac134-kube-api-access\") pod \"14a0545c-12d2-49a0-be5e-17f472bac134\" (UID: \"14a0545c-12d2-49a0-be5e-17f472bac134\") " Oct 11 10:50:07.459607 master-0 kubenswrapper[4790]: I1011 10:50:07.459021 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14a0545c-12d2-49a0-be5e-17f472bac134-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "14a0545c-12d2-49a0-be5e-17f472bac134" (UID: "14a0545c-12d2-49a0-be5e-17f472bac134"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:50:07.459897 master-0 kubenswrapper[4790]: I1011 10:50:07.459840 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14a0545c-12d2-49a0-be5e-17f472bac134-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Oct 11 10:50:07.464434 master-0 kubenswrapper[4790]: I1011 10:50:07.464356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a0545c-12d2-49a0-be5e-17f472bac134-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "14a0545c-12d2-49a0-be5e-17f472bac134" (UID: "14a0545c-12d2-49a0-be5e-17f472bac134"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:50:07.560975 master-0 kubenswrapper[4790]: I1011 10:50:07.560906 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14a0545c-12d2-49a0-be5e-17f472bac134-kube-api-access\") on node \"master-0\" DevicePath \"\"" Oct 11 10:50:07.887887 master-0 kubenswrapper[4790]: I1011 10:50:07.887647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-0" event={"ID":"14a0545c-12d2-49a0-be5e-17f472bac134","Type":"ContainerDied","Data":"b8dcb1e6a79f758d4b56351011463269e2f84dae8484b90ceaaf4bc932ecb153"} Oct 11 10:50:07.887887 master-0 kubenswrapper[4790]: I1011 10:50:07.887752 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8dcb1e6a79f758d4b56351011463269e2f84dae8484b90ceaaf4bc932ecb153" Oct 11 10:50:07.887887 master-0 kubenswrapper[4790]: I1011 10:50:07.887767 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-0" Oct 11 10:50:22.316602 master-0 kubenswrapper[4790]: I1011 10:50:22.316523 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-j48hd"] Oct 11 10:50:22.317508 master-0 kubenswrapper[4790]: E1011 10:50:22.316867 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a0545c-12d2-49a0-be5e-17f472bac134" containerName="pruner" Oct 11 10:50:22.317508 master-0 kubenswrapper[4790]: I1011 10:50:22.316887 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a0545c-12d2-49a0-be5e-17f472bac134" containerName="pruner" Oct 11 10:50:22.317508 master-0 kubenswrapper[4790]: I1011 10:50:22.317033 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a0545c-12d2-49a0-be5e-17f472bac134" containerName="pruner" Oct 11 10:50:22.318276 master-0 kubenswrapper[4790]: I1011 10:50:22.317931 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:22.357124 master-0 kubenswrapper[4790]: I1011 10:50:22.357044 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-j48hd"] Oct 11 10:50:22.402318 master-0 kubenswrapper[4790]: I1011 10:50:22.399921 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgkc\" (UniqueName: \"kubernetes.io/projected/90e79615-f456-4c3a-9e00-9683d29da694-kube-api-access-lkgkc\") pod \"openstack-operator-controller-operator-688d597459-j48hd\" (UID: \"90e79615-f456-4c3a-9e00-9683d29da694\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:22.501105 master-0 kubenswrapper[4790]: I1011 10:50:22.501028 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgkc\" (UniqueName: \"kubernetes.io/projected/90e79615-f456-4c3a-9e00-9683d29da694-kube-api-access-lkgkc\") pod \"openstack-operator-controller-operator-688d597459-j48hd\" (UID: \"90e79615-f456-4c3a-9e00-9683d29da694\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:22.525955 master-0 kubenswrapper[4790]: I1011 10:50:22.525906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgkc\" (UniqueName: \"kubernetes.io/projected/90e79615-f456-4c3a-9e00-9683d29da694-kube-api-access-lkgkc\") pod \"openstack-operator-controller-operator-688d597459-j48hd\" (UID: \"90e79615-f456-4c3a-9e00-9683d29da694\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:22.636597 master-0 kubenswrapper[4790]: I1011 10:50:22.636415 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:23.086441 master-0 kubenswrapper[4790]: I1011 10:50:23.086376 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-j48hd"] Oct 11 10:50:24.027285 master-0 kubenswrapper[4790]: I1011 10:50:24.027222 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" event={"ID":"90e79615-f456-4c3a-9e00-9683d29da694","Type":"ContainerStarted","Data":"866fb0186a0f425bf7f92274c83bb58565b7bd8634ccccde3ba085ee3d405915"} Oct 11 10:50:27.051263 master-0 kubenswrapper[4790]: I1011 10:50:27.051174 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" event={"ID":"90e79615-f456-4c3a-9e00-9683d29da694","Type":"ContainerStarted","Data":"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c"} Oct 11 10:50:29.068922 master-0 kubenswrapper[4790]: I1011 10:50:29.068799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" event={"ID":"90e79615-f456-4c3a-9e00-9683d29da694","Type":"ContainerStarted","Data":"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61"} Oct 11 10:50:29.069966 master-0 kubenswrapper[4790]: I1011 10:50:29.069115 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:29.120182 master-0 kubenswrapper[4790]: I1011 10:50:29.120070 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" podStartSLOduration=2.012448059 podStartE2EDuration="7.120040251s" podCreationTimestamp="2025-10-11 10:50:22 +0000 UTC" 
firstStartedPulling="2025-10-11 10:50:23.097174909 +0000 UTC m=+699.651635201" lastFinishedPulling="2025-10-11 10:50:28.204767101 +0000 UTC m=+704.759227393" observedRunningTime="2025-10-11 10:50:29.11888933 +0000 UTC m=+705.673349652" watchObservedRunningTime="2025-10-11 10:50:29.120040251 +0000 UTC m=+705.674500553" Oct 11 10:50:32.643043 master-0 kubenswrapper[4790]: I1011 10:50:32.642064 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:35.007196 master-0 kubenswrapper[4790]: I1011 10:50:35.007129 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7"] Oct 11 10:50:35.008150 master-0 kubenswrapper[4790]: I1011 10:50:35.008126 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" Oct 11 10:50:35.047303 master-0 kubenswrapper[4790]: I1011 10:50:35.047240 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7"] Oct 11 10:50:35.126152 master-0 kubenswrapper[4790]: I1011 10:50:35.125960 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8qh\" (UniqueName: \"kubernetes.io/projected/0b3dee2f-71f4-480b-a67a-ac73ab42d1f9-kube-api-access-vx8qh\") pod \"openstack-operator-controller-operator-566868fd7b-vpll7\" (UID: \"0b3dee2f-71f4-480b-a67a-ac73ab42d1f9\") " pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" Oct 11 10:50:35.227126 master-0 kubenswrapper[4790]: I1011 10:50:35.227060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8qh\" (UniqueName: \"kubernetes.io/projected/0b3dee2f-71f4-480b-a67a-ac73ab42d1f9-kube-api-access-vx8qh\") pod 
\"openstack-operator-controller-operator-566868fd7b-vpll7\" (UID: \"0b3dee2f-71f4-480b-a67a-ac73ab42d1f9\") " pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" Oct 11 10:50:35.247664 master-0 kubenswrapper[4790]: I1011 10:50:35.247606 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8qh\" (UniqueName: \"kubernetes.io/projected/0b3dee2f-71f4-480b-a67a-ac73ab42d1f9-kube-api-access-vx8qh\") pod \"openstack-operator-controller-operator-566868fd7b-vpll7\" (UID: \"0b3dee2f-71f4-480b-a67a-ac73ab42d1f9\") " pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" Oct 11 10:50:35.340527 master-0 kubenswrapper[4790]: I1011 10:50:35.340445 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" Oct 11 10:50:35.788768 master-0 kubenswrapper[4790]: I1011 10:50:35.788555 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7"] Oct 11 10:50:35.800231 master-0 kubenswrapper[4790]: W1011 10:50:35.800150 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b3dee2f_71f4_480b_a67a_ac73ab42d1f9.slice/crio-070e76e069764f246ee6726f4516e958738d349cd2b2e24c51954ba693d16100 WatchSource:0}: Error finding container 070e76e069764f246ee6726f4516e958738d349cd2b2e24c51954ba693d16100: Status 404 returned error can't find the container with id 070e76e069764f246ee6726f4516e958738d349cd2b2e24c51954ba693d16100 Oct 11 10:50:36.122879 master-0 kubenswrapper[4790]: I1011 10:50:36.122821 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" 
event={"ID":"0b3dee2f-71f4-480b-a67a-ac73ab42d1f9","Type":"ContainerStarted","Data":"7e764480a1439c202cd12594886e48ed0490c259faab7593e59ab2f0350d996d"} Oct 11 10:50:36.122879 master-0 kubenswrapper[4790]: I1011 10:50:36.122887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" event={"ID":"0b3dee2f-71f4-480b-a67a-ac73ab42d1f9","Type":"ContainerStarted","Data":"c6dd853fb6f82b8a00a3ba262160819580a535d21a365d783b51dee534d0d855"} Oct 11 10:50:36.122879 master-0 kubenswrapper[4790]: I1011 10:50:36.122903 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" event={"ID":"0b3dee2f-71f4-480b-a67a-ac73ab42d1f9","Type":"ContainerStarted","Data":"070e76e069764f246ee6726f4516e958738d349cd2b2e24c51954ba693d16100"} Oct 11 10:50:36.123652 master-0 kubenswrapper[4790]: I1011 10:50:36.123018 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" Oct 11 10:50:36.179563 master-0 kubenswrapper[4790]: I1011 10:50:36.179423 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" podStartSLOduration=2.179391697 podStartE2EDuration="2.179391697s" podCreationTimestamp="2025-10-11 10:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:50:36.171119483 +0000 UTC m=+712.725579855" watchObservedRunningTime="2025-10-11 10:50:36.179391697 +0000 UTC m=+712.733852029" Oct 11 10:50:45.344049 master-0 kubenswrapper[4790]: I1011 10:50:45.343971 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-566868fd7b-vpll7" Oct 11 10:50:45.458221 master-0 
kubenswrapper[4790]: I1011 10:50:45.458144 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-j48hd"] Oct 11 10:50:45.458513 master-0 kubenswrapper[4790]: I1011 10:50:45.458437 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="operator" containerID="cri-o://65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c" gracePeriod=10 Oct 11 10:50:45.458663 master-0 kubenswrapper[4790]: I1011 10:50:45.458534 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="kube-rbac-proxy" containerID="cri-o://d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61" gracePeriod=10 Oct 11 10:50:45.900846 master-0 kubenswrapper[4790]: I1011 10:50:45.900771 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:45.977184 master-0 kubenswrapper[4790]: I1011 10:50:45.977015 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkgkc\" (UniqueName: \"kubernetes.io/projected/90e79615-f456-4c3a-9e00-9683d29da694-kube-api-access-lkgkc\") pod \"90e79615-f456-4c3a-9e00-9683d29da694\" (UID: \"90e79615-f456-4c3a-9e00-9683d29da694\") " Oct 11 10:50:45.980513 master-0 kubenswrapper[4790]: I1011 10:50:45.980429 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e79615-f456-4c3a-9e00-9683d29da694-kube-api-access-lkgkc" (OuterVolumeSpecName: "kube-api-access-lkgkc") pod "90e79615-f456-4c3a-9e00-9683d29da694" (UID: "90e79615-f456-4c3a-9e00-9683d29da694"). 
InnerVolumeSpecName "kube-api-access-lkgkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:50:46.078609 master-0 kubenswrapper[4790]: I1011 10:50:46.078509 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkgkc\" (UniqueName: \"kubernetes.io/projected/90e79615-f456-4c3a-9e00-9683d29da694-kube-api-access-lkgkc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:50:46.186477 master-0 kubenswrapper[4790]: I1011 10:50:46.186342 4790 generic.go:334] "Generic (PLEG): container finished" podID="90e79615-f456-4c3a-9e00-9683d29da694" containerID="d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61" exitCode=0 Oct 11 10:50:46.186477 master-0 kubenswrapper[4790]: I1011 10:50:46.186428 4790 generic.go:334] "Generic (PLEG): container finished" podID="90e79615-f456-4c3a-9e00-9683d29da694" containerID="65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c" exitCode=0 Oct 11 10:50:46.186477 master-0 kubenswrapper[4790]: I1011 10:50:46.186430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" event={"ID":"90e79615-f456-4c3a-9e00-9683d29da694","Type":"ContainerDied","Data":"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61"} Oct 11 10:50:46.186915 master-0 kubenswrapper[4790]: I1011 10:50:46.186504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" event={"ID":"90e79615-f456-4c3a-9e00-9683d29da694","Type":"ContainerDied","Data":"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c"} Oct 11 10:50:46.186915 master-0 kubenswrapper[4790]: I1011 10:50:46.186398 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" Oct 11 10:50:46.186915 master-0 kubenswrapper[4790]: I1011 10:50:46.186544 4790 scope.go:117] "RemoveContainer" containerID="d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61" Oct 11 10:50:46.186915 master-0 kubenswrapper[4790]: I1011 10:50:46.186520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-j48hd" event={"ID":"90e79615-f456-4c3a-9e00-9683d29da694","Type":"ContainerDied","Data":"866fb0186a0f425bf7f92274c83bb58565b7bd8634ccccde3ba085ee3d405915"} Oct 11 10:50:46.208951 master-0 kubenswrapper[4790]: I1011 10:50:46.208809 4790 scope.go:117] "RemoveContainer" containerID="65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c" Oct 11 10:50:46.235578 master-0 kubenswrapper[4790]: I1011 10:50:46.235533 4790 scope.go:117] "RemoveContainer" containerID="d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61" Oct 11 10:50:46.236236 master-0 kubenswrapper[4790]: E1011 10:50:46.236195 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61\": container with ID starting with d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61 not found: ID does not exist" containerID="d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61" Oct 11 10:50:46.236313 master-0 kubenswrapper[4790]: I1011 10:50:46.236241 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61"} err="failed to get container status \"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61\": rpc error: code = NotFound desc = could not find container \"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61\": 
container with ID starting with d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61 not found: ID does not exist" Oct 11 10:50:46.236313 master-0 kubenswrapper[4790]: I1011 10:50:46.236268 4790 scope.go:117] "RemoveContainer" containerID="65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c" Oct 11 10:50:46.236764 master-0 kubenswrapper[4790]: E1011 10:50:46.236727 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c\": container with ID starting with 65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c not found: ID does not exist" containerID="65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c" Oct 11 10:50:46.236827 master-0 kubenswrapper[4790]: I1011 10:50:46.236754 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c"} err="failed to get container status \"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c\": rpc error: code = NotFound desc = could not find container \"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c\": container with ID starting with 65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c not found: ID does not exist" Oct 11 10:50:46.236827 master-0 kubenswrapper[4790]: I1011 10:50:46.236777 4790 scope.go:117] "RemoveContainer" containerID="d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61" Oct 11 10:50:46.237426 master-0 kubenswrapper[4790]: I1011 10:50:46.237307 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61"} err="failed to get container status \"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61\": rpc error: code = NotFound desc = could not find container 
\"d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61\": container with ID starting with d21bfc050573bee2ce236ab06a1a4750d84b960c474c7f0184f4e2764c771e61 not found: ID does not exist"
Oct 11 10:50:46.237512 master-0 kubenswrapper[4790]: I1011 10:50:46.237470 4790 scope.go:117] "RemoveContainer" containerID="65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c"
Oct 11 10:50:46.239249 master-0 kubenswrapper[4790]: I1011 10:50:46.239207 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c"} err="failed to get container status \"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c\": rpc error: code = NotFound desc = could not find container \"65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c\": container with ID starting with 65b1d808f699a2b6c607b7d8b40f13329503df25dad386f96458868225389a4c not found: ID does not exist"
Oct 11 10:50:46.241014 master-0 kubenswrapper[4790]: I1011 10:50:46.240956 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-j48hd"]
Oct 11 10:50:46.256063 master-0 kubenswrapper[4790]: I1011 10:50:46.255983 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-j48hd"]
Oct 11 10:50:46.300681 master-0 kubenswrapper[4790]: I1011 10:50:46.300619 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e79615-f456-4c3a-9e00-9683d29da694" path="/var/lib/kubelet/pods/90e79615-f456-4c3a-9e00-9683d29da694/volumes"
Oct 11 10:51:56.741521 master-0 kubenswrapper[4790]: I1011 10:51:56.741373 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"]
Oct 11 10:51:56.742340 master-0 kubenswrapper[4790]: E1011 10:51:56.741865 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="kube-rbac-proxy"
Oct 11 10:51:56.742340 master-0 kubenswrapper[4790]: I1011 10:51:56.741959 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="kube-rbac-proxy"
Oct 11 10:51:56.742340 master-0 kubenswrapper[4790]: E1011 10:51:56.741981 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="operator"
Oct 11 10:51:56.742340 master-0 kubenswrapper[4790]: I1011 10:51:56.741990 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="operator"
Oct 11 10:51:56.742340 master-0 kubenswrapper[4790]: I1011 10:51:56.742218 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="operator"
Oct 11 10:51:56.742340 master-0 kubenswrapper[4790]: I1011 10:51:56.742239 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e79615-f456-4c3a-9e00-9683d29da694" containerName="kube-rbac-proxy"
Oct 11 10:51:56.746355 master-0 kubenswrapper[4790]: I1011 10:51:56.744295 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"
Oct 11 10:51:56.776274 master-0 kubenswrapper[4790]: I1011 10:51:56.776175 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"]
Oct 11 10:51:56.909242 master-0 kubenswrapper[4790]: I1011 10:51:56.909158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsm2k\" (UniqueName: \"kubernetes.io/projected/f74275f9-5962-46ed-bbf9-9ca7dabea845-kube-api-access-vsm2k\") pod \"barbican-operator-controller-manager-658c7b459c-fzlrm\" (UID: \"f74275f9-5962-46ed-bbf9-9ca7dabea845\") " pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"
Oct 11 10:51:56.920348 master-0 kubenswrapper[4790]: I1011 10:51:56.920286 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"]
Oct 11 10:51:56.921323 master-0 kubenswrapper[4790]: I1011 10:51:56.921294 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:56.924154 master-0 kubenswrapper[4790]: I1011 10:51:56.924104 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 11 10:51:56.941040 master-0 kubenswrapper[4790]: I1011 10:51:56.940953 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"]
Oct 11 10:51:56.941907 master-0 kubenswrapper[4790]: I1011 10:51:56.941874 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"
Oct 11 10:51:56.948078 master-0 kubenswrapper[4790]: I1011 10:51:56.948027 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"]
Oct 11 10:51:56.968959 master-0 kubenswrapper[4790]: I1011 10:51:56.968901 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"]
Oct 11 10:51:57.019173 master-0 kubenswrapper[4790]: I1011 10:51:57.013572 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-cert\") pod \"infra-operator-controller-manager-d68fd5cdf-2dkw2\" (UID: \"19f1cdae-2bd6-42f2-aedc-7da343eeab3f\") " pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.019173 master-0 kubenswrapper[4790]: I1011 10:51:57.013654 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc9ns\" (UniqueName: \"kubernetes.io/projected/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-kube-api-access-nc9ns\") pod \"infra-operator-controller-manager-d68fd5cdf-2dkw2\" (UID: \"19f1cdae-2bd6-42f2-aedc-7da343eeab3f\") " pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.019173 master-0 kubenswrapper[4790]: I1011 10:51:57.013774 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsm2k\" (UniqueName: \"kubernetes.io/projected/f74275f9-5962-46ed-bbf9-9ca7dabea845-kube-api-access-vsm2k\") pod \"barbican-operator-controller-manager-658c7b459c-fzlrm\" (UID: \"f74275f9-5962-46ed-bbf9-9ca7dabea845\") " pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"
Oct 11 10:51:57.051331 master-0 kubenswrapper[4790]: I1011 10:51:57.051276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsm2k\" (UniqueName: \"kubernetes.io/projected/f74275f9-5962-46ed-bbf9-9ca7dabea845-kube-api-access-vsm2k\") pod \"barbican-operator-controller-manager-658c7b459c-fzlrm\" (UID: \"f74275f9-5962-46ed-bbf9-9ca7dabea845\") " pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"
Oct 11 10:51:57.081339 master-0 kubenswrapper[4790]: I1011 10:51:57.080653 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"
Oct 11 10:51:57.115572 master-0 kubenswrapper[4790]: I1011 10:51:57.115398 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-cert\") pod \"infra-operator-controller-manager-d68fd5cdf-2dkw2\" (UID: \"19f1cdae-2bd6-42f2-aedc-7da343eeab3f\") " pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.115572 master-0 kubenswrapper[4790]: I1011 10:51:57.115457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6fhn\" (UniqueName: \"kubernetes.io/projected/7c307dd6-17af-4fc5-8b19-d6fd59f46d04-kube-api-access-t6fhn\") pod \"horizon-operator-controller-manager-54969ff695-mxpp2\" (UID: \"7c307dd6-17af-4fc5-8b19-d6fd59f46d04\") " pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"
Oct 11 10:51:57.115572 master-0 kubenswrapper[4790]: I1011 10:51:57.115496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc9ns\" (UniqueName: \"kubernetes.io/projected/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-kube-api-access-nc9ns\") pod \"infra-operator-controller-manager-d68fd5cdf-2dkw2\" (UID: \"19f1cdae-2bd6-42f2-aedc-7da343eeab3f\") " pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.117402 master-0 kubenswrapper[4790]: E1011 10:51:57.115651 4790 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 11 10:51:57.117402 master-0 kubenswrapper[4790]: E1011 10:51:57.115789 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-cert podName:19f1cdae-2bd6-42f2-aedc-7da343eeab3f nodeName:}" failed. No retries permitted until 2025-10-11 10:51:57.615760913 +0000 UTC m=+794.170221275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-cert") pod "infra-operator-controller-manager-d68fd5cdf-2dkw2" (UID: "19f1cdae-2bd6-42f2-aedc-7da343eeab3f") : secret "infra-operator-webhook-server-cert" not found
Oct 11 10:51:57.121082 master-0 kubenswrapper[4790]: I1011 10:51:57.121040 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"]
Oct 11 10:51:57.129876 master-0 kubenswrapper[4790]: I1011 10:51:57.129843 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"
Oct 11 10:51:57.144493 master-0 kubenswrapper[4790]: I1011 10:51:57.144403 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"]
Oct 11 10:51:57.166275 master-0 kubenswrapper[4790]: I1011 10:51:57.166044 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc9ns\" (UniqueName: \"kubernetes.io/projected/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-kube-api-access-nc9ns\") pod \"infra-operator-controller-manager-d68fd5cdf-2dkw2\" (UID: \"19f1cdae-2bd6-42f2-aedc-7da343eeab3f\") " pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.203434 master-0 kubenswrapper[4790]: I1011 10:51:57.201612 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"]
Oct 11 10:51:57.203434 master-0 kubenswrapper[4790]: I1011 10:51:57.202949 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.206240 master-0 kubenswrapper[4790]: I1011 10:51:57.205916 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 11 10:51:57.216516 master-0 kubenswrapper[4790]: I1011 10:51:57.216423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6fhn\" (UniqueName: \"kubernetes.io/projected/7c307dd6-17af-4fc5-8b19-d6fd59f46d04-kube-api-access-t6fhn\") pod \"horizon-operator-controller-manager-54969ff695-mxpp2\" (UID: \"7c307dd6-17af-4fc5-8b19-d6fd59f46d04\") " pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"
Oct 11 10:51:57.216856 master-0 kubenswrapper[4790]: I1011 10:51:57.216558 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntnqz\" (UniqueName: \"kubernetes.io/projected/5409d60b-bd64-4ef3-9531-b264971d7d85-kube-api-access-ntnqz\") pod \"nova-operator-controller-manager-64487ccd4d-fzt8d\" (UID: \"5409d60b-bd64-4ef3-9531-b264971d7d85\") " pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"
Oct 11 10:51:57.238477 master-0 kubenswrapper[4790]: I1011 10:51:57.236932 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"]
Oct 11 10:51:57.256128 master-0 kubenswrapper[4790]: I1011 10:51:57.254721 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"]
Oct 11 10:51:57.256332 master-0 kubenswrapper[4790]: I1011 10:51:57.256310 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"
Oct 11 10:51:57.281982 master-0 kubenswrapper[4790]: I1011 10:51:57.281850 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"]
Oct 11 10:51:57.288205 master-0 kubenswrapper[4790]: I1011 10:51:57.288164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6fhn\" (UniqueName: \"kubernetes.io/projected/7c307dd6-17af-4fc5-8b19-d6fd59f46d04-kube-api-access-t6fhn\") pod \"horizon-operator-controller-manager-54969ff695-mxpp2\" (UID: \"7c307dd6-17af-4fc5-8b19-d6fd59f46d04\") " pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"
Oct 11 10:51:57.301758 master-0 kubenswrapper[4790]: I1011 10:51:57.301393 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"]
Oct 11 10:51:57.303215 master-0 kubenswrapper[4790]: I1011 10:51:57.303188 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"
Oct 11 10:51:57.323892 master-0 kubenswrapper[4790]: I1011 10:51:57.317874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21c0c53d-d3a0-45bf-84b3-930269d44522-cert\") pod \"openstack-baremetal-operator-controller-manager-78696cb447sdltf\" (UID: \"21c0c53d-d3a0-45bf-84b3-930269d44522\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.323892 master-0 kubenswrapper[4790]: I1011 10:51:57.317948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntnqz\" (UniqueName: \"kubernetes.io/projected/5409d60b-bd64-4ef3-9531-b264971d7d85-kube-api-access-ntnqz\") pod \"nova-operator-controller-manager-64487ccd4d-fzt8d\" (UID: \"5409d60b-bd64-4ef3-9531-b264971d7d85\") " pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"
Oct 11 10:51:57.323892 master-0 kubenswrapper[4790]: I1011 10:51:57.317978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmbgk\" (UniqueName: \"kubernetes.io/projected/21c0c53d-d3a0-45bf-84b3-930269d44522-kube-api-access-bmbgk\") pod \"openstack-baremetal-operator-controller-manager-78696cb447sdltf\" (UID: \"21c0c53d-d3a0-45bf-84b3-930269d44522\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.325573 master-0 kubenswrapper[4790]: I1011 10:51:57.325518 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"]
Oct 11 10:51:57.384351 master-0 kubenswrapper[4790]: I1011 10:51:57.384291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntnqz\" (UniqueName: \"kubernetes.io/projected/5409d60b-bd64-4ef3-9531-b264971d7d85-kube-api-access-ntnqz\") pod \"nova-operator-controller-manager-64487ccd4d-fzt8d\" (UID: \"5409d60b-bd64-4ef3-9531-b264971d7d85\") " pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"
Oct 11 10:51:57.418776 master-0 kubenswrapper[4790]: I1011 10:51:57.418682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21c0c53d-d3a0-45bf-84b3-930269d44522-cert\") pod \"openstack-baremetal-operator-controller-manager-78696cb447sdltf\" (UID: \"21c0c53d-d3a0-45bf-84b3-930269d44522\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.418776 master-0 kubenswrapper[4790]: I1011 10:51:57.418761 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrt2d\" (UniqueName: \"kubernetes.io/projected/b135d684-0aee-445a-8c2b-a5c5a656b626-kube-api-access-lrt2d\") pod \"ovn-operator-controller-manager-f9dd6d5b6-qt8lg\" (UID: \"b135d684-0aee-445a-8c2b-a5c5a656b626\") " pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"
Oct 11 10:51:57.419089 master-0 kubenswrapper[4790]: I1011 10:51:57.418817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmbgk\" (UniqueName: \"kubernetes.io/projected/21c0c53d-d3a0-45bf-84b3-930269d44522-kube-api-access-bmbgk\") pod \"openstack-baremetal-operator-controller-manager-78696cb447sdltf\" (UID: \"21c0c53d-d3a0-45bf-84b3-930269d44522\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.419089 master-0 kubenswrapper[4790]: E1011 10:51:57.418991 4790 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 11 10:51:57.419089 master-0 kubenswrapper[4790]: I1011 10:51:57.419034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8lkw\" (UniqueName: \"kubernetes.io/projected/6c42b87f-92a0-4250-bc3a-7b117dcf8df8-kube-api-access-x8lkw\") pod \"placement-operator-controller-manager-569c9576c5-wpgbc\" (UID: \"6c42b87f-92a0-4250-bc3a-7b117dcf8df8\") " pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"
Oct 11 10:51:57.419210 master-0 kubenswrapper[4790]: E1011 10:51:57.419098 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21c0c53d-d3a0-45bf-84b3-930269d44522-cert podName:21c0c53d-d3a0-45bf-84b3-930269d44522 nodeName:}" failed. No retries permitted until 2025-10-11 10:51:57.919072548 +0000 UTC m=+794.473532930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21c0c53d-d3a0-45bf-84b3-930269d44522-cert") pod "openstack-baremetal-operator-controller-manager-78696cb447sdltf" (UID: "21c0c53d-d3a0-45bf-84b3-930269d44522") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 11 10:51:57.465349 master-0 kubenswrapper[4790]: I1011 10:51:57.465259 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmbgk\" (UniqueName: \"kubernetes.io/projected/21c0c53d-d3a0-45bf-84b3-930269d44522-kube-api-access-bmbgk\") pod \"openstack-baremetal-operator-controller-manager-78696cb447sdltf\" (UID: \"21c0c53d-d3a0-45bf-84b3-930269d44522\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.511455 master-0 kubenswrapper[4790]: I1011 10:51:57.511373 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"
Oct 11 10:51:57.524297 master-0 kubenswrapper[4790]: I1011 10:51:57.524233 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8lkw\" (UniqueName: \"kubernetes.io/projected/6c42b87f-92a0-4250-bc3a-7b117dcf8df8-kube-api-access-x8lkw\") pod \"placement-operator-controller-manager-569c9576c5-wpgbc\" (UID: \"6c42b87f-92a0-4250-bc3a-7b117dcf8df8\") " pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"
Oct 11 10:51:57.524805 master-0 kubenswrapper[4790]: I1011 10:51:57.524769 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrt2d\" (UniqueName: \"kubernetes.io/projected/b135d684-0aee-445a-8c2b-a5c5a656b626-kube-api-access-lrt2d\") pod \"ovn-operator-controller-manager-f9dd6d5b6-qt8lg\" (UID: \"b135d684-0aee-445a-8c2b-a5c5a656b626\") " pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"
Oct 11 10:51:57.533064 master-0 kubenswrapper[4790]: I1011 10:51:57.532943 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm"]
Oct 11 10:51:57.550946 master-0 kubenswrapper[4790]: I1011 10:51:57.550899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrt2d\" (UniqueName: \"kubernetes.io/projected/b135d684-0aee-445a-8c2b-a5c5a656b626-kube-api-access-lrt2d\") pod \"ovn-operator-controller-manager-f9dd6d5b6-qt8lg\" (UID: \"b135d684-0aee-445a-8c2b-a5c5a656b626\") " pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"
Oct 11 10:51:57.556906 master-0 kubenswrapper[4790]: I1011 10:51:57.556864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8lkw\" (UniqueName: \"kubernetes.io/projected/6c42b87f-92a0-4250-bc3a-7b117dcf8df8-kube-api-access-x8lkw\") pod \"placement-operator-controller-manager-569c9576c5-wpgbc\" (UID: \"6c42b87f-92a0-4250-bc3a-7b117dcf8df8\") " pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"
Oct 11 10:51:57.571450 master-0 kubenswrapper[4790]: I1011 10:51:57.571385 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"
Oct 11 10:51:57.587582 master-0 kubenswrapper[4790]: I1011 10:51:57.587156 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"
Oct 11 10:51:57.628257 master-0 kubenswrapper[4790]: I1011 10:51:57.626803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-cert\") pod \"infra-operator-controller-manager-d68fd5cdf-2dkw2\" (UID: \"19f1cdae-2bd6-42f2-aedc-7da343eeab3f\") " pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.634585 master-0 kubenswrapper[4790]: I1011 10:51:57.632504 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19f1cdae-2bd6-42f2-aedc-7da343eeab3f-cert\") pod \"infra-operator-controller-manager-d68fd5cdf-2dkw2\" (UID: \"19f1cdae-2bd6-42f2-aedc-7da343eeab3f\") " pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.642283 master-0 kubenswrapper[4790]: I1011 10:51:57.642237 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"
Oct 11 10:51:57.654360 master-0 kubenswrapper[4790]: I1011 10:51:57.654297 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"]
Oct 11 10:51:57.657878 master-0 kubenswrapper[4790]: I1011 10:51:57.657113 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:57.669361 master-0 kubenswrapper[4790]: I1011 10:51:57.669097 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 11 10:51:57.680740 master-0 kubenswrapper[4790]: I1011 10:51:57.680664 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"]
Oct 11 10:51:57.779156 master-0 kubenswrapper[4790]: I1011 10:51:57.778132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm" event={"ID":"f74275f9-5962-46ed-bbf9-9ca7dabea845","Type":"ContainerStarted","Data":"0b4365c7dc2ee855855b8ac105a5b9668bebabbf177ea6828f276517bdfc93db"}
Oct 11 10:51:57.834861 master-0 kubenswrapper[4790]: I1011 10:51:57.834799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:57.834998 master-0 kubenswrapper[4790]: I1011 10:51:57.834972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhbq\" (UniqueName: \"kubernetes.io/projected/eaac04d2-f217-437a-b0db-9cc23f0373d9-kube-api-access-xkhbq\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:57.848350 master-0 kubenswrapper[4790]: I1011 10:51:57.848302 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"
Oct 11 10:51:57.937124 master-0 kubenswrapper[4790]: I1011 10:51:57.935985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhbq\" (UniqueName: \"kubernetes.io/projected/eaac04d2-f217-437a-b0db-9cc23f0373d9-kube-api-access-xkhbq\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:57.937124 master-0 kubenswrapper[4790]: I1011 10:51:57.936050 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21c0c53d-d3a0-45bf-84b3-930269d44522-cert\") pod \"openstack-baremetal-operator-controller-manager-78696cb447sdltf\" (UID: \"21c0c53d-d3a0-45bf-84b3-930269d44522\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.937124 master-0 kubenswrapper[4790]: I1011 10:51:57.936073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:57.937124 master-0 kubenswrapper[4790]: E1011 10:51:57.936283 4790 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 11 10:51:57.937124 master-0 kubenswrapper[4790]: E1011 10:51:57.936369 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert podName:eaac04d2-f217-437a-b0db-9cc23f0373d9 nodeName:}" failed. No retries permitted until 2025-10-11 10:51:58.436349116 +0000 UTC m=+794.990809408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert") pod "openstack-operator-controller-manager-6df4464d49-mxsms" (UID: "eaac04d2-f217-437a-b0db-9cc23f0373d9") : secret "webhook-server-cert" not found
Oct 11 10:51:57.940756 master-0 kubenswrapper[4790]: I1011 10:51:57.940459 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21c0c53d-d3a0-45bf-84b3-930269d44522-cert\") pod \"openstack-baremetal-operator-controller-manager-78696cb447sdltf\" (UID: \"21c0c53d-d3a0-45bf-84b3-930269d44522\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:57.961652 master-0 kubenswrapper[4790]: I1011 10:51:57.961281 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhbq\" (UniqueName: \"kubernetes.io/projected/eaac04d2-f217-437a-b0db-9cc23f0373d9-kube-api-access-xkhbq\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:58.035421 master-0 kubenswrapper[4790]: I1011 10:51:58.035315 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d"]
Oct 11 10:51:58.041881 master-0 kubenswrapper[4790]: W1011 10:51:58.041803 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5409d60b_bd64_4ef3_9531_b264971d7d85.slice/crio-ff5b3345190b56922afd299bda24109abb663fde8b39b786a9be7d57ab57b654 WatchSource:0}: Error finding container ff5b3345190b56922afd299bda24109abb663fde8b39b786a9be7d57ab57b654: Status 404 returned error can't find the container with id ff5b3345190b56922afd299bda24109abb663fde8b39b786a9be7d57ab57b654
Oct 11 10:51:58.130617 master-0 kubenswrapper[4790]: I1011 10:51:58.130563 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"
Oct 11 10:51:58.141817 master-0 kubenswrapper[4790]: I1011 10:51:58.141762 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc"]
Oct 11 10:51:58.145668 master-0 kubenswrapper[4790]: I1011 10:51:58.145645 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg"]
Oct 11 10:51:58.145939 master-0 kubenswrapper[4790]: W1011 10:51:58.145914 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb135d684_0aee_445a_8c2b_a5c5a656b626.slice/crio-bf58d2dedb74858dd134dc115b577a38ad6f4937ef9c552587299a28ded00d4d WatchSource:0}: Error finding container bf58d2dedb74858dd134dc115b577a38ad6f4937ef9c552587299a28ded00d4d: Status 404 returned error can't find the container with id bf58d2dedb74858dd134dc115b577a38ad6f4937ef9c552587299a28ded00d4d
Oct 11 10:51:58.162587 master-0 kubenswrapper[4790]: I1011 10:51:58.162517 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2"]
Oct 11 10:51:58.174304 master-0 kubenswrapper[4790]: W1011 10:51:58.174227 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c307dd6_17af_4fc5_8b19_d6fd59f46d04.slice/crio-28f5797f04e0fe592135e1a4cf930f9905c1390e10d1402225e7ae5e4a602f70 WatchSource:0}: Error finding container 28f5797f04e0fe592135e1a4cf930f9905c1390e10d1402225e7ae5e4a602f70: Status 404 returned error can't find the container with id 28f5797f04e0fe592135e1a4cf930f9905c1390e10d1402225e7ae5e4a602f70
Oct 11 10:51:58.341934 master-0 kubenswrapper[4790]: I1011 10:51:58.341819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2"]
Oct 11 10:51:58.351095 master-0 kubenswrapper[4790]: W1011 10:51:58.351029 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f1cdae_2bd6_42f2_aedc_7da343eeab3f.slice/crio-4266f1178d515a46f7ab4c097c954e43465d0e1f07c76e2b362f1350921b0b49 WatchSource:0}: Error finding container 4266f1178d515a46f7ab4c097c954e43465d0e1f07c76e2b362f1350921b0b49: Status 404 returned error can't find the container with id 4266f1178d515a46f7ab4c097c954e43465d0e1f07c76e2b362f1350921b0b49
Oct 11 10:51:58.443855 master-0 kubenswrapper[4790]: I1011 10:51:58.443768 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:58.444176 master-0 kubenswrapper[4790]: E1011 10:51:58.444026 4790 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 11 10:51:58.444176 master-0 kubenswrapper[4790]: E1011 10:51:58.444144 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert podName:eaac04d2-f217-437a-b0db-9cc23f0373d9 nodeName:}" failed. No retries permitted until 2025-10-11 10:51:59.444121505 +0000 UTC m=+795.998581807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert") pod "openstack-operator-controller-manager-6df4464d49-mxsms" (UID: "eaac04d2-f217-437a-b0db-9cc23f0373d9") : secret "webhook-server-cert" not found
Oct 11 10:51:58.598182 master-0 kubenswrapper[4790]: I1011 10:51:58.597142 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf"]
Oct 11 10:51:58.605214 master-0 kubenswrapper[4790]: W1011 10:51:58.605155 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c0c53d_d3a0_45bf_84b3_930269d44522.slice/crio-1a5deb5d4077f2ddea7f4af077eb7bf664c066c6489b47e900a513d1b1aa7168 WatchSource:0}: Error finding container 1a5deb5d4077f2ddea7f4af077eb7bf664c066c6489b47e900a513d1b1aa7168: Status 404 returned error can't find the container with id 1a5deb5d4077f2ddea7f4af077eb7bf664c066c6489b47e900a513d1b1aa7168
Oct 11 10:51:58.798097 master-0 kubenswrapper[4790]: I1011 10:51:58.798023 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2" event={"ID":"19f1cdae-2bd6-42f2-aedc-7da343eeab3f","Type":"ContainerStarted","Data":"4266f1178d515a46f7ab4c097c954e43465d0e1f07c76e2b362f1350921b0b49"}
Oct 11 10:51:58.800003 master-0 kubenswrapper[4790]: I1011 10:51:58.799926 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2" event={"ID":"7c307dd6-17af-4fc5-8b19-d6fd59f46d04","Type":"ContainerStarted","Data":"28f5797f04e0fe592135e1a4cf930f9905c1390e10d1402225e7ae5e4a602f70"}
Oct 11 10:51:58.801277 master-0 kubenswrapper[4790]: I1011 10:51:58.801239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc" event={"ID":"6c42b87f-92a0-4250-bc3a-7b117dcf8df8","Type":"ContainerStarted","Data":"e5a4e321cec755b4c9d6d97797b6ea153af34773a75fba9c871f395aa01b258f"}
Oct 11 10:51:58.803088 master-0 kubenswrapper[4790]: I1011 10:51:58.803052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf" event={"ID":"21c0c53d-d3a0-45bf-84b3-930269d44522","Type":"ContainerStarted","Data":"1a5deb5d4077f2ddea7f4af077eb7bf664c066c6489b47e900a513d1b1aa7168"}
Oct 11 10:51:58.805231 master-0 kubenswrapper[4790]: I1011 10:51:58.805192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d" event={"ID":"5409d60b-bd64-4ef3-9531-b264971d7d85","Type":"ContainerStarted","Data":"ff5b3345190b56922afd299bda24109abb663fde8b39b786a9be7d57ab57b654"}
Oct 11 10:51:58.806295 master-0 kubenswrapper[4790]: I1011 10:51:58.806254 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg" event={"ID":"b135d684-0aee-445a-8c2b-a5c5a656b626","Type":"ContainerStarted","Data":"bf58d2dedb74858dd134dc115b577a38ad6f4937ef9c552587299a28ded00d4d"}
Oct 11 10:51:59.461787 master-0 kubenswrapper[4790]: I1011 10:51:59.461132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:59.467854 master-0 kubenswrapper[4790]: I1011 10:51:59.467812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaac04d2-f217-437a-b0db-9cc23f0373d9-cert\") pod \"openstack-operator-controller-manager-6df4464d49-mxsms\" (UID: \"eaac04d2-f217-437a-b0db-9cc23f0373d9\") " pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:59.508366 master-0 kubenswrapper[4790]: I1011 10:51:59.508316 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"
Oct 11 10:51:59.829913 master-0 kubenswrapper[4790]: I1011 10:51:59.819020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm" event={"ID":"f74275f9-5962-46ed-bbf9-9ca7dabea845","Type":"ContainerStarted","Data":"9056e918687243915cd44a8c7aa8b91c8d48787e2c6f2f16ce172469ec0791d7"}
Oct 11 10:51:59.931826 master-0 kubenswrapper[4790]: I1011 10:51:59.931753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms"]
Oct 11 10:52:00.313214 master-0 kubenswrapper[4790]: W1011 10:52:00.313162 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaac04d2_f217_437a_b0db_9cc23f0373d9.slice/crio-2e8313d5669b88da027d018d458dddc0a47010415bd8b882cbecbed62e368acc WatchSource:0}: Error finding container 2e8313d5669b88da027d018d458dddc0a47010415bd8b882cbecbed62e368acc: Status 404 returned error can't find the container with id 2e8313d5669b88da027d018d458dddc0a47010415bd8b882cbecbed62e368acc
Oct 11 10:52:00.828045 master-0 kubenswrapper[4790]: I1011 10:52:00.827964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm" event={"ID":"f74275f9-5962-46ed-bbf9-9ca7dabea845","Type":"ContainerStarted","Data":"c469b1f9624dc83431ecfa1097a646cc5204df163b5e9600f542f9184e330157"} Oct 11 10:52:00.828244 master-0 kubenswrapper[4790]: I1011 10:52:00.828076 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm" Oct 11 10:52:00.829269 master-0 kubenswrapper[4790]: I1011 10:52:00.829215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms" event={"ID":"eaac04d2-f217-437a-b0db-9cc23f0373d9","Type":"ContainerStarted","Data":"2e8313d5669b88da027d018d458dddc0a47010415bd8b882cbecbed62e368acc"} Oct 11 10:52:02.845238 master-0 kubenswrapper[4790]: I1011 10:52:02.845174 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg" event={"ID":"b135d684-0aee-445a-8c2b-a5c5a656b626","Type":"ContainerStarted","Data":"f640e13f11de91b617029d2fdd39a8a326a773b2d9e02b6521eb9a33a08101e0"} Oct 11 10:52:02.853318 master-0 kubenswrapper[4790]: I1011 10:52:02.847006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d" event={"ID":"5409d60b-bd64-4ef3-9531-b264971d7d85","Type":"ContainerStarted","Data":"72acd123369e5a79e158e1c77d131a1cd9069735e8e1d24a9b1885d47062cd04"} Oct 11 10:52:02.853318 master-0 kubenswrapper[4790]: I1011 10:52:02.848693 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2" event={"ID":"19f1cdae-2bd6-42f2-aedc-7da343eeab3f","Type":"ContainerStarted","Data":"c99a97fb3d1e339f0f59a39f0083a7b1d3f32414e607e3d9b2728bda5d2ba691"} Oct 11 10:52:02.853318 master-0 kubenswrapper[4790]: I1011 10:52:02.851198 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2" event={"ID":"7c307dd6-17af-4fc5-8b19-d6fd59f46d04","Type":"ContainerStarted","Data":"da4c04407ccb096e6ac194d0124c5080b7ca07594b2e2ddf939bdcf0e0944300"} Oct 11 10:52:02.853318 master-0 kubenswrapper[4790]: I1011 10:52:02.852621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc" event={"ID":"6c42b87f-92a0-4250-bc3a-7b117dcf8df8","Type":"ContainerStarted","Data":"fb1028b4755566b85d715830c2ee0708fac025f310affaf0d13092f92979fc82"} Oct 11 10:52:02.856981 master-0 kubenswrapper[4790]: I1011 10:52:02.855733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf" event={"ID":"21c0c53d-d3a0-45bf-84b3-930269d44522","Type":"ContainerStarted","Data":"1eca6c3bb22c908f052c0c1290370df1fbae05682e137ba7ac1007b2ab888b91"} Oct 11 10:52:02.861721 master-0 kubenswrapper[4790]: I1011 10:52:02.861670 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms" event={"ID":"eaac04d2-f217-437a-b0db-9cc23f0373d9","Type":"ContainerStarted","Data":"4098ac39db0ed59946f155cac604eb1993fdc5e9fb5898ed87c461be5152fe8b"} Oct 11 10:52:03.871558 master-0 kubenswrapper[4790]: I1011 10:52:03.871453 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg" event={"ID":"b135d684-0aee-445a-8c2b-a5c5a656b626","Type":"ContainerStarted","Data":"ea4d6846b635e4229cce4b63a087d15e792a98486ab324ff333bda09ee17ab5f"} Oct 11 10:52:03.872325 master-0 kubenswrapper[4790]: I1011 10:52:03.871600 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg" Oct 11 10:52:03.874185 master-0 kubenswrapper[4790]: I1011 
10:52:03.874140 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d" event={"ID":"5409d60b-bd64-4ef3-9531-b264971d7d85","Type":"ContainerStarted","Data":"b2cfb028b6aefa96d5a7c700e6836a3c4e642e9f52c7f1431cc462732eb89820"} Oct 11 10:52:03.874469 master-0 kubenswrapper[4790]: I1011 10:52:03.874425 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d" Oct 11 10:52:03.876955 master-0 kubenswrapper[4790]: I1011 10:52:03.876904 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2" event={"ID":"19f1cdae-2bd6-42f2-aedc-7da343eeab3f","Type":"ContainerStarted","Data":"94abc5eed96ab9d66944fa23b51cfccbd88ca3cce68e2ac777cd669eaa9926dd"} Oct 11 10:52:03.877278 master-0 kubenswrapper[4790]: I1011 10:52:03.877233 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2" Oct 11 10:52:03.879179 master-0 kubenswrapper[4790]: I1011 10:52:03.879137 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2" event={"ID":"7c307dd6-17af-4fc5-8b19-d6fd59f46d04","Type":"ContainerStarted","Data":"2a4ea12348174660f930fd9dc8f8c5cc45a22a2f4c64269e436c99f6ad787756"} Oct 11 10:52:03.879219 master-0 kubenswrapper[4790]: I1011 10:52:03.879198 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2" Oct 11 10:52:03.881199 master-0 kubenswrapper[4790]: I1011 10:52:03.881145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc" 
event={"ID":"6c42b87f-92a0-4250-bc3a-7b117dcf8df8","Type":"ContainerStarted","Data":"c69aa721dd66e9ebfa9acaa641972414223791d47232a85e5e781a80433ff900"} Oct 11 10:52:03.881408 master-0 kubenswrapper[4790]: I1011 10:52:03.881364 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc" Oct 11 10:52:03.882651 master-0 kubenswrapper[4790]: I1011 10:52:03.882619 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf" event={"ID":"21c0c53d-d3a0-45bf-84b3-930269d44522","Type":"ContainerStarted","Data":"eba0f894775776451abe26f5f38d30403da2394c7afe40c350b6ecc7a0ad2a29"} Oct 11 10:52:03.882754 master-0 kubenswrapper[4790]: I1011 10:52:03.882733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf" Oct 11 10:52:03.884395 master-0 kubenswrapper[4790]: I1011 10:52:03.884361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms" event={"ID":"eaac04d2-f217-437a-b0db-9cc23f0373d9","Type":"ContainerStarted","Data":"496616631f6edf2b7361233334dcbd10923471c94500daceacfb0a34e5f3f347"} Oct 11 10:52:03.884604 master-0 kubenswrapper[4790]: I1011 10:52:03.884565 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms" Oct 11 10:52:03.900159 master-0 kubenswrapper[4790]: I1011 10:52:03.900070 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm" podStartSLOduration=6.036135212 podStartE2EDuration="7.900050647s" podCreationTimestamp="2025-10-11 10:51:56 +0000 UTC" firstStartedPulling="2025-10-11 10:51:57.541466688 +0000 UTC m=+794.095926980" 
lastFinishedPulling="2025-10-11 10:51:59.405382133 +0000 UTC m=+795.959842415" observedRunningTime="2025-10-11 10:52:00.850234023 +0000 UTC m=+797.404694325" watchObservedRunningTime="2025-10-11 10:52:03.900050647 +0000 UTC m=+800.454510969" Oct 11 10:52:03.904312 master-0 kubenswrapper[4790]: I1011 10:52:03.904240 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg" podStartSLOduration=2.585296461 podStartE2EDuration="6.9042285s" podCreationTimestamp="2025-10-11 10:51:57 +0000 UTC" firstStartedPulling="2025-10-11 10:51:58.169849698 +0000 UTC m=+794.724309990" lastFinishedPulling="2025-10-11 10:52:02.488781727 +0000 UTC m=+799.043242029" observedRunningTime="2025-10-11 10:52:03.897420946 +0000 UTC m=+800.451881268" watchObservedRunningTime="2025-10-11 10:52:03.9042285 +0000 UTC m=+800.458688822" Oct 11 10:52:03.934733 master-0 kubenswrapper[4790]: I1011 10:52:03.931095 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2" podStartSLOduration=3.642513883 podStartE2EDuration="7.931059818s" podCreationTimestamp="2025-10-11 10:51:56 +0000 UTC" firstStartedPulling="2025-10-11 10:51:58.197180649 +0000 UTC m=+794.751640941" lastFinishedPulling="2025-10-11 10:52:02.485726564 +0000 UTC m=+799.040186876" observedRunningTime="2025-10-11 10:52:03.922391083 +0000 UTC m=+800.476851395" watchObservedRunningTime="2025-10-11 10:52:03.931059818 +0000 UTC m=+800.485520120" Oct 11 10:52:03.966741 master-0 kubenswrapper[4790]: I1011 10:52:03.966420 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms" podStartSLOduration=6.966393647 podStartE2EDuration="6.966393647s" podCreationTimestamp="2025-10-11 10:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:52:03.962431709 +0000 UTC m=+800.516892011" watchObservedRunningTime="2025-10-11 10:52:03.966393647 +0000 UTC m=+800.520853939" Oct 11 10:52:04.018588 master-0 kubenswrapper[4790]: I1011 10:52:04.018325 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d" podStartSLOduration=2.576805312 podStartE2EDuration="7.018298004s" podCreationTimestamp="2025-10-11 10:51:57 +0000 UTC" firstStartedPulling="2025-10-11 10:51:58.044747876 +0000 UTC m=+794.599208168" lastFinishedPulling="2025-10-11 10:52:02.486240558 +0000 UTC m=+799.040700860" observedRunningTime="2025-10-11 10:52:03.987239052 +0000 UTC m=+800.541699374" watchObservedRunningTime="2025-10-11 10:52:04.018298004 +0000 UTC m=+800.572758296" Oct 11 10:52:04.018849 master-0 kubenswrapper[4790]: I1011 10:52:04.018656 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2" podStartSLOduration=3.884591137 podStartE2EDuration="8.018651383s" podCreationTimestamp="2025-10-11 10:51:56 +0000 UTC" firstStartedPulling="2025-10-11 10:51:58.354815164 +0000 UTC m=+794.909275456" lastFinishedPulling="2025-10-11 10:52:02.48887539 +0000 UTC m=+799.043335702" observedRunningTime="2025-10-11 10:52:04.016158196 +0000 UTC m=+800.570618518" watchObservedRunningTime="2025-10-11 10:52:04.018651383 +0000 UTC m=+800.573111675" Oct 11 10:52:04.059576 master-0 kubenswrapper[4790]: I1011 10:52:04.059485 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf" podStartSLOduration=3.181059477 podStartE2EDuration="7.05945875s" podCreationTimestamp="2025-10-11 10:51:57 +0000 UTC" firstStartedPulling="2025-10-11 10:51:58.608053981 +0000 UTC m=+795.162514313" 
lastFinishedPulling="2025-10-11 10:52:02.486453254 +0000 UTC m=+799.040913586" observedRunningTime="2025-10-11 10:52:04.054512856 +0000 UTC m=+800.608973148" watchObservedRunningTime="2025-10-11 10:52:04.05945875 +0000 UTC m=+800.613919032" Oct 11 10:52:07.084376 master-0 kubenswrapper[4790]: I1011 10:52:07.084275 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658c7b459c-fzlrm" Oct 11 10:52:07.114821 master-0 kubenswrapper[4790]: I1011 10:52:07.114482 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc" podStartSLOduration=5.797994332 podStartE2EDuration="10.114444574s" podCreationTimestamp="2025-10-11 10:51:57 +0000 UTC" firstStartedPulling="2025-10-11 10:51:58.169795016 +0000 UTC m=+794.724255308" lastFinishedPulling="2025-10-11 10:52:02.486245218 +0000 UTC m=+799.040705550" observedRunningTime="2025-10-11 10:52:04.081502718 +0000 UTC m=+800.635963010" watchObservedRunningTime="2025-10-11 10:52:07.114444574 +0000 UTC m=+803.668904906" Oct 11 10:52:07.515773 master-0 kubenswrapper[4790]: I1011 10:52:07.515559 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64487ccd4d-fzt8d" Oct 11 10:52:07.577199 master-0 kubenswrapper[4790]: I1011 10:52:07.576257 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54969ff695-mxpp2" Oct 11 10:52:07.590866 master-0 kubenswrapper[4790]: I1011 10:52:07.590519 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-f9dd6d5b6-qt8lg" Oct 11 10:52:07.648038 master-0 kubenswrapper[4790]: I1011 10:52:07.647941 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-569c9576c5-wpgbc" Oct 11 10:52:07.856689 master-0 kubenswrapper[4790]: I1011 10:52:07.856626 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-d68fd5cdf-2dkw2" Oct 11 10:52:08.139667 master-0 kubenswrapper[4790]: I1011 10:52:08.139443 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78696cb447sdltf" Oct 11 10:52:09.518065 master-0 kubenswrapper[4790]: I1011 10:52:09.517883 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6df4464d49-mxsms" Oct 11 10:52:48.221108 master-0 kubenswrapper[4790]: I1011 10:52:48.221028 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6944757b7f-plhvq"] Oct 11 10:52:48.224604 master-0 kubenswrapper[4790]: I1011 10:52:48.224555 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.228527 master-0 kubenswrapper[4790]: I1011 10:52:48.228012 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 10:52:48.228527 master-0 kubenswrapper[4790]: I1011 10:52:48.228278 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 10:52:48.228527 master-0 kubenswrapper[4790]: I1011 10:52:48.228430 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 11 10:52:48.228887 master-0 kubenswrapper[4790]: I1011 10:52:48.228819 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 11 10:52:48.245136 master-0 kubenswrapper[4790]: I1011 10:52:48.245061 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6944757b7f-plhvq"] Oct 11 10:52:48.369819 master-0 kubenswrapper[4790]: I1011 10:52:48.367931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-dns-svc\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.370153 master-0 kubenswrapper[4790]: I1011 10:52:48.370050 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-config\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.370153 master-0 kubenswrapper[4790]: I1011 10:52:48.370104 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wx2k\" (UniqueName: 
\"kubernetes.io/projected/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-kube-api-access-7wx2k\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.471935 master-0 kubenswrapper[4790]: I1011 10:52:48.471852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-dns-svc\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.471935 master-0 kubenswrapper[4790]: I1011 10:52:48.471922 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-config\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.471935 master-0 kubenswrapper[4790]: I1011 10:52:48.471959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wx2k\" (UniqueName: \"kubernetes.io/projected/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-kube-api-access-7wx2k\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.472965 master-0 kubenswrapper[4790]: I1011 10:52:48.472908 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-dns-svc\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.473470 master-0 kubenswrapper[4790]: I1011 10:52:48.473421 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-config\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.498448 master-0 kubenswrapper[4790]: I1011 10:52:48.498280 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wx2k\" (UniqueName: \"kubernetes.io/projected/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-kube-api-access-7wx2k\") pod \"dnsmasq-dns-6944757b7f-plhvq\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") " pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:48.538831 master-0 kubenswrapper[4790]: I1011 10:52:48.538337 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:52:49.007346 master-0 kubenswrapper[4790]: I1011 10:52:49.007165 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6944757b7f-plhvq"] Oct 11 10:52:49.019975 master-0 kubenswrapper[4790]: I1011 10:52:49.019896 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 10:52:49.273596 master-0 kubenswrapper[4790]: I1011 10:52:49.273372 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" event={"ID":"f97fbf89-0d03-4ed8-a0d2-4f796e705e20","Type":"ContainerStarted","Data":"a4a9276558748dc6921cea20229a43f4acf57679858b58aec74901d23d4a131c"} Oct 11 10:52:57.862754 master-0 kubenswrapper[4790]: I1011 10:52:57.862652 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Oct 11 10:52:57.864661 master-0 kubenswrapper[4790]: I1011 10:52:57.864627 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Oct 11 10:52:57.880385 master-0 kubenswrapper[4790]: I1011 10:52:57.880329 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 11 10:52:57.880732 master-0 kubenswrapper[4790]: I1011 10:52:57.880400 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 11 10:52:57.880732 master-0 kubenswrapper[4790]: I1011 10:52:57.880557 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 11 10:52:57.880732 master-0 kubenswrapper[4790]: I1011 10:52:57.880657 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 11 10:52:57.882496 master-0 kubenswrapper[4790]: I1011 10:52:57.882179 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 11 10:52:57.896546 master-0 kubenswrapper[4790]: I1011 10:52:57.883564 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 11 10:52:57.900909 master-0 kubenswrapper[4790]: I1011 10:52:57.900841 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Oct 11 10:52:57.919977 master-0 kubenswrapper[4790]: I1011 10:52:57.919820 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-1"] Oct 11 10:52:57.922534 master-0 kubenswrapper[4790]: I1011 10:52:57.922502 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-1" Oct 11 10:52:57.927122 master-0 kubenswrapper[4790]: I1011 10:52:57.926999 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 11 10:52:57.927441 master-0 kubenswrapper[4790]: I1011 10:52:57.927152 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 11 10:52:57.928386 master-0 kubenswrapper[4790]: I1011 10:52:57.927694 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.941207 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-1"] Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-server-conf\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1" Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1" Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945195 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktw9d\" (UniqueName: \"kubernetes.io/projected/1ebbbaaf-f668-4c40-b437-2e730aef3912-kube-api-access-ktw9d\") pod \"alertmanager-metric-storage-1\" (UID: 
\"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ebbbaaf-f668-4c40-b437-2e730aef3912-config-volume\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945248 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/1ebbbaaf-f668-4c40-b437-2e730aef3912-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945268 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kvmh\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-kube-api-access-2kvmh\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945301 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ebbbaaf-f668-4c40-b437-2e730aef3912-config-out\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ebbbaaf-f668-4c40-b437-2e730aef3912-tls-assets\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945343 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945359 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945380 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ebbbaaf-f668-4c40-b437-2e730aef3912-web-config\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-config-data\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-91b83a1d-eeb3-4510-8d45-14992a484dba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c7f44e-079f-46cd-a04e-fcf37ad4dbd2\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946085 master-0 kubenswrapper[4790]: I1011 10:52:57.945464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c13cb0d1-c50f-44fa-824a-46ece423a7cc-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:57.946699 master-0 kubenswrapper[4790]: I1011 10:52:57.945483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c13cb0d1-c50f-44fa-824a-46ece423a7cc-pod-info\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.050686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/1ebbbaaf-f668-4c40-b437-2e730aef3912-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.050818 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.050852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kvmh\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-kube-api-access-2kvmh\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ebbbaaf-f668-4c40-b437-2e730aef3912-config-out\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ebbbaaf-f668-4c40-b437-2e730aef3912-tls-assets\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ebbbaaf-f668-4c40-b437-2e730aef3912-web-config\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051525 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/1ebbbaaf-f668-4c40-b437-2e730aef3912-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051278 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-config-data\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.051737 master-0 kubenswrapper[4790]: I1011 10:52:58.051756 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-91b83a1d-eeb3-4510-8d45-14992a484dba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c7f44e-079f-46cd-a04e-fcf37ad4dbd2\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.052457 master-0 kubenswrapper[4790]: I1011 10:52:58.051795 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c13cb0d1-c50f-44fa-824a-46ece423a7cc-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.052457 master-0 kubenswrapper[4790]: I1011 10:52:58.051837 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c13cb0d1-c50f-44fa-824a-46ece423a7cc-pod-info\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.052457 master-0 kubenswrapper[4790]: I1011 10:52:58.051873 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-server-conf\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.052457 master-0 kubenswrapper[4790]: I1011 10:52:58.051927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.052457 master-0 kubenswrapper[4790]: I1011 10:52:58.051982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktw9d\" (UniqueName: \"kubernetes.io/projected/1ebbbaaf-f668-4c40-b437-2e730aef3912-kube-api-access-ktw9d\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.052457 master-0 kubenswrapper[4790]: I1011 10:52:58.052062 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ebbbaaf-f668-4c40-b437-2e730aef3912-config-volume\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.068739 master-0 kubenswrapper[4790]: I1011 10:52:58.064048 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.086734 master-0 kubenswrapper[4790]: I1011 10:52:58.079988 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ebbbaaf-f668-4c40-b437-2e730aef3912-tls-assets\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.086734 master-0 kubenswrapper[4790]: I1011 10:52:58.083163 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 11 10:52:58.086734 master-0 kubenswrapper[4790]: I1011 10:52:58.083224 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-91b83a1d-eeb3-4510-8d45-14992a484dba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c7f44e-079f-46cd-a04e-fcf37ad4dbd2\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/1114453ea83085dab1ca278aa52d732a8f54ac9ebf0fc42d5b564ce2eb10c0e8/globalmount\"" pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.086734 master-0 kubenswrapper[4790]: I1011 10:52:58.084244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.086734 master-0 kubenswrapper[4790]: I1011 10:52:58.085107 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-config-data\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.099735 master-0 kubenswrapper[4790]: I1011 10:52:58.094241 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ebbbaaf-f668-4c40-b437-2e730aef3912-config-out\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.099735 master-0 kubenswrapper[4790]: I1011 10:52:58.095692 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c13cb0d1-c50f-44fa-824a-46ece423a7cc-server-conf\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.099735 master-0 kubenswrapper[4790]: I1011 10:52:58.096544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kvmh\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-kube-api-access-2kvmh\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.099735 master-0 kubenswrapper[4790]: I1011 10:52:58.096997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.099735 master-0 kubenswrapper[4790]: I1011 10:52:58.097734 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.125261 master-0 kubenswrapper[4790]: I1011 10:52:58.101441 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c13cb0d1-c50f-44fa-824a-46ece423a7cc-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.130287 master-0 kubenswrapper[4790]: I1011 10:52:58.129190 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c13cb0d1-c50f-44fa-824a-46ece423a7cc-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.130287 master-0 kubenswrapper[4790]: I1011 10:52:58.129790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ebbbaaf-f668-4c40-b437-2e730aef3912-web-config\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.151743 master-0 kubenswrapper[4790]: I1011 10:52:58.147845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c13cb0d1-c50f-44fa-824a-46ece423a7cc-pod-info\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:58.151743 master-0 kubenswrapper[4790]: I1011 10:52:58.149854 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktw9d\" (UniqueName: \"kubernetes.io/projected/1ebbbaaf-f668-4c40-b437-2e730aef3912-kube-api-access-ktw9d\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.184819 master-0 kubenswrapper[4790]: I1011 10:52:58.179369 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ebbbaaf-f668-4c40-b437-2e730aef3912-config-volume\") pod \"alertmanager-metric-storage-1\" (UID: \"1ebbbaaf-f668-4c40-b437-2e730aef3912\") " pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:58.252257 master-0 kubenswrapper[4790]: I1011 10:52:58.248599 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-1"
Oct 11 10:52:59.677868 master-0 kubenswrapper[4790]: I1011 10:52:59.677786 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-91b83a1d-eeb3-4510-8d45-14992a484dba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c7f44e-079f-46cd-a04e-fcf37ad4dbd2\") pod \"rabbitmq-server-1\" (UID: \"c13cb0d1-c50f-44fa-824a-46ece423a7cc\") " pod="openstack/rabbitmq-server-1"
Oct 11 10:52:59.704997 master-0 kubenswrapper[4790]: I1011 10:52:59.704935 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1"
Oct 11 10:53:01.092282 master-0 kubenswrapper[4790]: I1011 10:53:01.092185 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-52t2l"]
Oct 11 10:53:01.101132 master-0 kubenswrapper[4790]: I1011 10:53:01.101068 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.175872 master-0 kubenswrapper[4790]: I1011 10:53:01.175808 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-52t2l"]
Oct 11 10:53:01.176542 master-0 kubenswrapper[4790]: I1011 10:53:01.176395 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Oct 11 10:53:01.176542 master-0 kubenswrapper[4790]: I1011 10:53:01.176445 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Oct 11 10:53:01.178906 master-0 kubenswrapper[4790]: I1011 10:53:01.177669 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 11 10:53:01.185059 master-0 kubenswrapper[4790]: I1011 10:53:01.179276 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dw8wx"]
Oct 11 10:53:01.185059 master-0 kubenswrapper[4790]: I1011 10:53:01.180538 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.190661 master-0 kubenswrapper[4790]: I1011 10:53:01.190109 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dw8wx"]
Oct 11 10:53:01.234010 master-0 kubenswrapper[4790]: I1011 10:53:01.232686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfjnz\" (UniqueName: \"kubernetes.io/projected/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-kube-api-access-hfjnz\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.234010 master-0 kubenswrapper[4790]: I1011 10:53:01.232804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-run-ovn\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.234010 master-0 kubenswrapper[4790]: I1011 10:53:01.232847 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-log-ovn\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.234010 master-0 kubenswrapper[4790]: I1011 10:53:01.232882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-combined-ca-bundle\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.234010 master-0 kubenswrapper[4790]: I1011 10:53:01.232930 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-ovn-controller-tls-certs\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.234010 master-0 kubenswrapper[4790]: I1011 10:53:01.232962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-run\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.234010 master-0 kubenswrapper[4790]: I1011 10:53:01.232989 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-scripts\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-combined-ca-bundle\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334349 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-scripts\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334403 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-ovn-controller-tls-certs\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334426 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-run\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334452 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-scripts\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334474 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-lib\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-etc-ovs\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334527 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfjnz\" (UniqueName: \"kubernetes.io/projected/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-kube-api-access-hfjnz\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334567 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-run-ovn\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334591 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-run\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334609 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7v6\" (UniqueName: \"kubernetes.io/projected/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-kube-api-access-zl7v6\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-log-ovn\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.335429 master-0 kubenswrapper[4790]: I1011 10:53:01.334648 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-log\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.336447 master-0 kubenswrapper[4790]: I1011 10:53:01.335686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-log-ovn\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.336447 master-0 kubenswrapper[4790]: I1011 10:53:01.336004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-run-ovn\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.336447 master-0 kubenswrapper[4790]: I1011 10:53:01.336051 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-var-run\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.338353 master-0 kubenswrapper[4790]: I1011 10:53:01.338295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-ovn-controller-tls-certs\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.344826 master-0 kubenswrapper[4790]: I1011 10:53:01.340441 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-scripts\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.344826 master-0 kubenswrapper[4790]: I1011 10:53:01.342043 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-combined-ca-bundle\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.356548 master-0 kubenswrapper[4790]: I1011 10:53:01.355652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfjnz\" (UniqueName: \"kubernetes.io/projected/8c164a4b-a2d5-4570-aed3-86dbb1f3d47c-kube-api-access-hfjnz\") pod \"ovn-controller-52t2l\" (UID: \"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c\") " pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.436744 master-0 kubenswrapper[4790]: I1011 10:53:01.436659 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-lib\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.436997 master-0 kubenswrapper[4790]: I1011 10:53:01.436782 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-etc-ovs\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437236 master-0 kubenswrapper[4790]: I1011 10:53:01.437199 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-run\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437236 master-0 kubenswrapper[4790]: I1011 10:53:01.437232 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7v6\" (UniqueName: \"kubernetes.io/projected/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-kube-api-access-zl7v6\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437336 master-0 kubenswrapper[4790]: I1011 10:53:01.437262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-log\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437507 master-0 kubenswrapper[4790]: I1011 10:53:01.437450 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-run\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437704 master-0 kubenswrapper[4790]: I1011 10:53:01.437636 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-etc-ovs\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437806 master-0 kubenswrapper[4790]: I1011 10:53:01.437649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-lib\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437806 master-0 kubenswrapper[4790]: I1011 10:53:01.437797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-var-log\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.437905 master-0 kubenswrapper[4790]: I1011 10:53:01.437681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-scripts\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.439950 master-0 kubenswrapper[4790]: I1011 10:53:01.439918 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-scripts\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.457795 master-0 kubenswrapper[4790]: I1011 10:53:01.457679 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:01.469489 master-0 kubenswrapper[4790]: I1011 10:53:01.469421 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7v6\" (UniqueName: \"kubernetes.io/projected/c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0-kube-api-access-zl7v6\") pod \"ovn-controller-ovs-dw8wx\" (UID: \"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0\") " pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:01.514513 master-0 kubenswrapper[4790]: I1011 10:53:01.514452 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:02.680348 master-0 kubenswrapper[4790]: I1011 10:53:02.678147 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"]
Oct 11 10:53:02.792417 master-0 kubenswrapper[4790]: I1011 10:53:02.790094 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-52t2l"]
Oct 11 10:53:02.805010 master-0 kubenswrapper[4790]: I1011 10:53:02.804342 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-1"]
Oct 11 10:53:02.866462 master-0 kubenswrapper[4790]: I1011 10:53:02.865727 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-82rlt"]
Oct 11 10:53:02.867112 master-0 kubenswrapper[4790]: I1011 10:53:02.867086 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-82rlt"
Oct 11 10:53:02.872291 master-0 kubenswrapper[4790]: I1011 10:53:02.872234 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 11 10:53:02.875103 master-0 kubenswrapper[4790]: I1011 10:53:02.873880 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Oct 11 10:53:02.914534 master-0 kubenswrapper[4790]: I1011 10:53:02.914449 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-82rlt"]
Oct 11 10:53:02.966892 master-0 kubenswrapper[4790]: I1011 10:53:02.966844 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c572ba-fc98-4468-939a-bbe0eadb7b63-config\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt"
Oct 11 10:53:02.967004 master-0 kubenswrapper[4790]: I1011 10:53:02.966900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f4c572ba-fc98-4468-939a-bbe0eadb7b63-ovn-rundir\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt"
Oct 11 10:53:02.967004 master-0 kubenswrapper[4790]: I1011 10:53:02.966940 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c572ba-fc98-4468-939a-bbe0eadb7b63-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt"
Oct 11 10:53:02.967004 master-0 kubenswrapper[4790]: I1011 10:53:02.966968 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbvjm\" (UniqueName: \"kubernetes.io/projected/f4c572ba-fc98-4468-939a-bbe0eadb7b63-kube-api-access-pbvjm\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt"
Oct 11 10:53:02.967105 master-0 kubenswrapper[4790]: I1011 10:53:02.967016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f4c572ba-fc98-4468-939a-bbe0eadb7b63-ovs-rundir\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt"
Oct 11 10:53:02.967105 master-0 kubenswrapper[4790]: I1011 10:53:02.967033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c572ba-fc98-4468-939a-bbe0eadb7b63-combined-ca-bundle\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " 
pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.068400 master-0 kubenswrapper[4790]: I1011 10:53:03.068330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f4c572ba-fc98-4468-939a-bbe0eadb7b63-ovs-rundir\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.068400 master-0 kubenswrapper[4790]: I1011 10:53:03.068384 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c572ba-fc98-4468-939a-bbe0eadb7b63-combined-ca-bundle\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.068400 master-0 kubenswrapper[4790]: I1011 10:53:03.068416 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c572ba-fc98-4468-939a-bbe0eadb7b63-config\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.069057 master-0 kubenswrapper[4790]: I1011 10:53:03.068436 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f4c572ba-fc98-4468-939a-bbe0eadb7b63-ovn-rundir\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.069057 master-0 kubenswrapper[4790]: I1011 10:53:03.068468 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c572ba-fc98-4468-939a-bbe0eadb7b63-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " 
pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.069057 master-0 kubenswrapper[4790]: I1011 10:53:03.068492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbvjm\" (UniqueName: \"kubernetes.io/projected/f4c572ba-fc98-4468-939a-bbe0eadb7b63-kube-api-access-pbvjm\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.069057 master-0 kubenswrapper[4790]: I1011 10:53:03.068939 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f4c572ba-fc98-4468-939a-bbe0eadb7b63-ovs-rundir\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.072005 master-0 kubenswrapper[4790]: I1011 10:53:03.069407 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f4c572ba-fc98-4468-939a-bbe0eadb7b63-ovn-rundir\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.072005 master-0 kubenswrapper[4790]: I1011 10:53:03.070283 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c572ba-fc98-4468-939a-bbe0eadb7b63-config\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.076929 master-0 kubenswrapper[4790]: I1011 10:53:03.076573 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c572ba-fc98-4468-939a-bbe0eadb7b63-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " 
pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.077115 master-0 kubenswrapper[4790]: I1011 10:53:03.076846 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c572ba-fc98-4468-939a-bbe0eadb7b63-combined-ca-bundle\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.094790 master-0 kubenswrapper[4790]: I1011 10:53:03.094673 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbvjm\" (UniqueName: \"kubernetes.io/projected/f4c572ba-fc98-4468-939a-bbe0eadb7b63-kube-api-access-pbvjm\") pod \"ovn-controller-metrics-82rlt\" (UID: \"f4c572ba-fc98-4468-939a-bbe0eadb7b63\") " pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.218058 master-0 kubenswrapper[4790]: I1011 10:53:03.217890 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-82rlt" Oct 11 10:53:03.346765 master-0 kubenswrapper[4790]: I1011 10:53:03.346204 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-1"] Oct 11 10:53:03.347534 master-0 kubenswrapper[4790]: I1011 10:53:03.347499 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.353339 master-0 kubenswrapper[4790]: I1011 10:53:03.353266 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 11 10:53:03.353566 master-0 kubenswrapper[4790]: I1011 10:53:03.353502 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 11 10:53:03.354686 master-0 kubenswrapper[4790]: I1011 10:53:03.354651 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 11 10:53:03.354931 master-0 kubenswrapper[4790]: I1011 10:53:03.354900 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 11 10:53:03.356628 master-0 kubenswrapper[4790]: I1011 10:53:03.355115 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 11 10:53:03.356628 master-0 kubenswrapper[4790]: I1011 10:53:03.356265 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 11 10:53:03.382502 master-0 kubenswrapper[4790]: I1011 10:53:03.381827 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-1"] Oct 11 10:53:03.475187 master-0 kubenswrapper[4790]: I1011 10:53:03.475055 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6d71dd31-2a9e-4c3d-bff4-ee1b201f04c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^62c55997-0912-4363-a44a-b273574b90ee\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475187 master-0 kubenswrapper[4790]: I1011 10:53:03.475132 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475187 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be929908-6474-451d-8b87-e4effd7c6de4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475263 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475288 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be929908-6474-451d-8b87-e4effd7c6de4-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475344 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475358 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7sl2\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-kube-api-access-q7sl2\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.475421 master-0 kubenswrapper[4790]: I1011 10:53:03.475382 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.484296 master-0 kubenswrapper[4790]: I1011 10:53:03.478963 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"1ebbbaaf-f668-4c40-b437-2e730aef3912","Type":"ContainerStarted","Data":"7f9ca70ea9150158ea01389d4f5d47ae8eb1c96eba28945e19777e7f6cd26a21"} Oct 11 10:53:03.484296 master-0 kubenswrapper[4790]: I1011 10:53:03.480373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-52t2l" event={"ID":"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c","Type":"ContainerStarted","Data":"ffd8516e1a802f15260c59750ec313118428488f3fac69d6ce8787ab8d39ef71"} Oct 11 10:53:03.484296 master-0 kubenswrapper[4790]: I1011 10:53:03.482339 4790 generic.go:334] "Generic (PLEG): container finished" podID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerID="00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa" exitCode=0 Oct 11 10:53:03.484296 master-0 kubenswrapper[4790]: I1011 10:53:03.482416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" event={"ID":"f97fbf89-0d03-4ed8-a0d2-4f796e705e20","Type":"ContainerDied","Data":"00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa"} Oct 11 10:53:03.484889 master-0 kubenswrapper[4790]: I1011 10:53:03.484603 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c13cb0d1-c50f-44fa-824a-46ece423a7cc","Type":"ContainerStarted","Data":"55aa020178354779f63bbec7883cd55060aea9bddab75564cd660ae8d36eac99"} Oct 11 10:53:03.577608 master-0 kubenswrapper[4790]: I1011 10:53:03.577532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6d71dd31-2a9e-4c3d-bff4-ee1b201f04c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^62c55997-0912-4363-a44a-b273574b90ee\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577634 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be929908-6474-451d-8b87-e4effd7c6de4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577770 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577801 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577835 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be929908-6474-451d-8b87-e4effd7c6de4-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577866 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577950 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7sl2\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-kube-api-access-q7sl2\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.578009 master-0 kubenswrapper[4790]: I1011 10:53:03.577978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.580316 master-0 kubenswrapper[4790]: I1011 10:53:03.580245 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.580569 master-0 kubenswrapper[4790]: I1011 10:53:03.580513 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.580834 master-0 kubenswrapper[4790]: I1011 10:53:03.580808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.582827 master-0 kubenswrapper[4790]: I1011 10:53:03.582686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.586675 master-0 kubenswrapper[4790]: I1011 10:53:03.584964 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be929908-6474-451d-8b87-e4effd7c6de4-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.586675 master-0 kubenswrapper[4790]: I1011 10:53:03.585561 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 10:53:03.586675 master-0 kubenswrapper[4790]: I1011 10:53:03.585592 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6d71dd31-2a9e-4c3d-bff4-ee1b201f04c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^62c55997-0912-4363-a44a-b273574b90ee\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0325335efdc6373b1da0b492b8c5ad80b94f8a3a314c03c4818d28b6fb013145/globalmount\"" pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.586675 master-0 kubenswrapper[4790]: I1011 10:53:03.586587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be929908-6474-451d-8b87-e4effd7c6de4-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.589672 master-0 kubenswrapper[4790]: I1011 10:53:03.589612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.591258 master-0 kubenswrapper[4790]: I1011 10:53:03.589955 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be929908-6474-451d-8b87-e4effd7c6de4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.591398 master-0 kubenswrapper[4790]: I1011 10:53:03.591293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" 
(UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.603781 master-0 kubenswrapper[4790]: I1011 10:53:03.603695 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7sl2\" (UniqueName: \"kubernetes.io/projected/be929908-6474-451d-8b87-e4effd7c6de4-kube-api-access-q7sl2\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:03.693462 master-0 kubenswrapper[4790]: E1011 10:53:03.693282 4790 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 11 10:53:03.693462 master-0 kubenswrapper[4790]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/f97fbf89-0d03-4ed8-a0d2-4f796e705e20/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 11 10:53:03.693462 master-0 kubenswrapper[4790]: > podSandboxID="a4a9276558748dc6921cea20229a43f4acf57679858b58aec74901d23d4a131c" Oct 11 10:53:03.695366 master-0 kubenswrapper[4790]: E1011 10:53:03.693539 4790 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 10:53:03.695366 master-0 kubenswrapper[4790]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58fh579h64dh56ch657h674h656h9fh547h5hf7hc6h557hfdh566h66fh69h5cdhfh59fh58ch678h587h68ch675h6ch559h5f4h549h5f7h56fh586q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wx2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000790000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6944757b7f-plhvq_openstack(f97fbf89-0d03-4ed8-a0d2-4f796e705e20): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/f97fbf89-0d03-4ed8-a0d2-4f796e705e20/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 11 10:53:03.695366 master-0 kubenswrapper[4790]: > logger="UnhandledError" Oct 11 10:53:03.695366 master-0 kubenswrapper[4790]: E1011 10:53:03.694945 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/f97fbf89-0d03-4ed8-a0d2-4f796e705e20/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" Oct 11 10:53:03.722323 master-0 kubenswrapper[4790]: I1011 10:53:03.720684 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-82rlt"] Oct 11 10:53:04.100413 master-0 kubenswrapper[4790]: I1011 10:53:04.100376 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-ovs-dw8wx"] Oct 11 10:53:04.493228 master-0 kubenswrapper[4790]: I1011 10:53:04.493177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dw8wx" event={"ID":"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0","Type":"ContainerStarted","Data":"62cb6b7233fe70010bd1bf3163750a8ee8f35a71e5521a83b5ec97f307536726"} Oct 11 10:53:04.495944 master-0 kubenswrapper[4790]: I1011 10:53:04.495837 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-82rlt" event={"ID":"f4c572ba-fc98-4468-939a-bbe0eadb7b63","Type":"ContainerStarted","Data":"f6bc5207e4efea93fc1bba72f36dea738790b37fd7ecba635a7f8b806c0ce82e"} Oct 11 10:53:05.150634 master-0 kubenswrapper[4790]: I1011 10:53:05.150584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6d71dd31-2a9e-4c3d-bff4-ee1b201f04c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^62c55997-0912-4363-a44a-b273574b90ee\") pod \"rabbitmq-cell1-server-1\" (UID: \"be929908-6474-451d-8b87-e4effd7c6de4\") " pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:05.511365 master-0 kubenswrapper[4790]: I1011 10:53:05.510131 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:53:10.366548 master-0 kubenswrapper[4790]: I1011 10:53:10.366380 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-1"] Oct 11 10:53:10.367941 master-0 kubenswrapper[4790]: I1011 10:53:10.367913 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-1" Oct 11 10:53:10.371280 master-0 kubenswrapper[4790]: I1011 10:53:10.371247 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 11 10:53:10.371578 master-0 kubenswrapper[4790]: I1011 10:53:10.371530 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 11 10:53:10.371820 master-0 kubenswrapper[4790]: I1011 10:53:10.371794 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 11 10:53:10.372098 master-0 kubenswrapper[4790]: I1011 10:53:10.372067 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 11 10:53:10.417991 master-0 kubenswrapper[4790]: I1011 10:53:10.417922 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-1"] Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.505816 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.505882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-config-data-generated\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.505906 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7bdz\" (UniqueName: 
\"kubernetes.io/projected/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-kube-api-access-k7bdz\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.505933 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-secrets\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.506006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c3a1a9c-ad77-4ede-a234-173033baae18\" (UniqueName: \"kubernetes.io/csi/topolvm.io^af3ccd44-89a2-4e58-bc75-402f4b9b5935\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.506028 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-config-data-default\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.506491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-kolla-config\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.506588 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-operator-scripts\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.506756 master-0 kubenswrapper[4790]: I1011 10:53:10.506668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608065 master-0 kubenswrapper[4790]: I1011 10:53:10.608006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-kolla-config\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608065 master-0 kubenswrapper[4790]: I1011 10:53:10.608078 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-operator-scripts\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608341 master-0 kubenswrapper[4790]: I1011 10:53:10.608123 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608341 master-0 kubenswrapper[4790]: I1011 10:53:10.608155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608341 master-0 kubenswrapper[4790]: I1011 10:53:10.608182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bdz\" (UniqueName: \"kubernetes.io/projected/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-kube-api-access-k7bdz\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608341 master-0 kubenswrapper[4790]: I1011 10:53:10.608210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-config-data-generated\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608341 master-0 kubenswrapper[4790]: I1011 10:53:10.608236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-secrets\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608341 master-0 kubenswrapper[4790]: I1011 10:53:10.608297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c3a1a9c-ad77-4ede-a234-173033baae18\" (UniqueName: \"kubernetes.io/csi/topolvm.io^af3ccd44-89a2-4e58-bc75-402f4b9b5935\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.608341 master-0 kubenswrapper[4790]: I1011 10:53:10.608317 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-config-data-default\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.609098 master-0 kubenswrapper[4790]: I1011 10:53:10.609063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-config-data-generated\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.609377 master-0 kubenswrapper[4790]: I1011 10:53:10.609281 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-kolla-config\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.609693 master-0 kubenswrapper[4790]: I1011 10:53:10.609639 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-config-data-default\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.610323 master-0 kubenswrapper[4790]: I1011 10:53:10.610246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-operator-scripts\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.612120 master-0 kubenswrapper[4790]: I1011 10:53:10.612073 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 10:53:10.612201 master-0 kubenswrapper[4790]: I1011 10:53:10.612129 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c3a1a9c-ad77-4ede-a234-173033baae18\" (UniqueName: \"kubernetes.io/csi/topolvm.io^af3ccd44-89a2-4e58-bc75-402f4b9b5935\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/773ba8a3ebdf6ed1b03af49c5e8575a4250fd6ceb2d7c89ab924e3ac620fe81d/globalmount\"" pod="openstack/openstack-galera-1" Oct 11 10:53:10.612685 master-0 kubenswrapper[4790]: I1011 10:53:10.612635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-secrets\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.612928 master-0 kubenswrapper[4790]: I1011 10:53:10.612880 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.614130 master-0 kubenswrapper[4790]: I1011 10:53:10.614076 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:10.680439 master-0 kubenswrapper[4790]: I1011 10:53:10.680362 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bdz\" (UniqueName: \"kubernetes.io/projected/ce689fd9-58ba-45f5-bec1-ff7b79e377ac-kube-api-access-k7bdz\") pod \"openstack-galera-1\" (UID: 
\"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:11.082261 master-0 kubenswrapper[4790]: I1011 10:53:11.081577 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-1"] Oct 11 10:53:11.134224 master-0 kubenswrapper[4790]: W1011 10:53:11.134078 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe929908_6474_451d_8b87_e4effd7c6de4.slice/crio-90758d7643e6263d46a1f152d835bc723beeacfae63fd6b9b412589cc960a694 WatchSource:0}: Error finding container 90758d7643e6263d46a1f152d835bc723beeacfae63fd6b9b412589cc960a694: Status 404 returned error can't find the container with id 90758d7643e6263d46a1f152d835bc723beeacfae63fd6b9b412589cc960a694 Oct 11 10:53:11.557954 master-0 kubenswrapper[4790]: I1011 10:53:11.557874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-52t2l" event={"ID":"8c164a4b-a2d5-4570-aed3-86dbb1f3d47c","Type":"ContainerStarted","Data":"b5a30090d52e04bd1585718bea7397f834cc2eb435df5a374c553c8fcde615e5"} Oct 11 10:53:11.558760 master-0 kubenswrapper[4790]: I1011 10:53:11.558006 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-52t2l" Oct 11 10:53:11.561260 master-0 kubenswrapper[4790]: I1011 10:53:11.561016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dw8wx" event={"ID":"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0","Type":"ContainerStarted","Data":"ba831bf6dddeb3f64993c649a1f54543cd4a62c4464b0ea7cd2cf9d325aa66f4"} Oct 11 10:53:11.563207 master-0 kubenswrapper[4790]: I1011 10:53:11.563142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"be929908-6474-451d-8b87-e4effd7c6de4","Type":"ContainerStarted","Data":"90758d7643e6263d46a1f152d835bc723beeacfae63fd6b9b412589cc960a694"} Oct 11 10:53:11.566465 master-0 kubenswrapper[4790]: 
I1011 10:53:11.566428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" event={"ID":"f97fbf89-0d03-4ed8-a0d2-4f796e705e20","Type":"ContainerStarted","Data":"92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2"} Oct 11 10:53:11.567652 master-0 kubenswrapper[4790]: I1011 10:53:11.567631 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" Oct 11 10:53:11.570897 master-0 kubenswrapper[4790]: I1011 10:53:11.570830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-82rlt" event={"ID":"f4c572ba-fc98-4468-939a-bbe0eadb7b63","Type":"ContainerStarted","Data":"89db518426b8fdbecdfe04227dd0757e12bf32a3706a5be2c7e5de68c3e46acd"} Oct 11 10:53:11.709073 master-0 kubenswrapper[4790]: I1011 10:53:11.708974 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-52t2l" podStartSLOduration=2.739323252 podStartE2EDuration="10.70894888s" podCreationTimestamp="2025-10-11 10:53:01 +0000 UTC" firstStartedPulling="2025-10-11 10:53:02.823870941 +0000 UTC m=+859.378331233" lastFinishedPulling="2025-10-11 10:53:10.793496569 +0000 UTC m=+867.347956861" observedRunningTime="2025-10-11 10:53:11.706860873 +0000 UTC m=+868.261321245" watchObservedRunningTime="2025-10-11 10:53:11.70894888 +0000 UTC m=+868.263409172" Oct 11 10:53:11.888975 master-0 kubenswrapper[4790]: I1011 10:53:11.888877 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c3a1a9c-ad77-4ede-a234-173033baae18\" (UniqueName: \"kubernetes.io/csi/topolvm.io^af3ccd44-89a2-4e58-bc75-402f4b9b5935\") pod \"openstack-galera-1\" (UID: \"ce689fd9-58ba-45f5-bec1-ff7b79e377ac\") " pod="openstack/openstack-galera-1" Oct 11 10:53:12.080860 master-0 kubenswrapper[4790]: I1011 10:53:12.080777 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6944757b7f-plhvq"] Oct 11 
10:53:12.197132 master-0 kubenswrapper[4790]: I1011 10:53:12.196985 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-1" Oct 11 10:53:12.265309 master-0 kubenswrapper[4790]: I1011 10:53:12.265197 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-82rlt" podStartSLOduration=3.014561065 podStartE2EDuration="10.265173926s" podCreationTimestamp="2025-10-11 10:53:02 +0000 UTC" firstStartedPulling="2025-10-11 10:53:03.79662037 +0000 UTC m=+860.351080662" lastFinishedPulling="2025-10-11 10:53:11.047233231 +0000 UTC m=+867.601693523" observedRunningTime="2025-10-11 10:53:12.265083494 +0000 UTC m=+868.819543796" watchObservedRunningTime="2025-10-11 10:53:12.265173926 +0000 UTC m=+868.819634218" Oct 11 10:53:12.560211 master-0 kubenswrapper[4790]: I1011 10:53:12.559480 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" podStartSLOduration=11.347103393 podStartE2EDuration="24.559447762s" podCreationTimestamp="2025-10-11 10:52:48 +0000 UTC" firstStartedPulling="2025-10-11 10:52:49.019844924 +0000 UTC m=+845.574305226" lastFinishedPulling="2025-10-11 10:53:02.232189283 +0000 UTC m=+858.786649595" observedRunningTime="2025-10-11 10:53:12.552366596 +0000 UTC m=+869.106826888" watchObservedRunningTime="2025-10-11 10:53:12.559447762 +0000 UTC m=+869.113908074" Oct 11 10:53:12.581340 master-0 kubenswrapper[4790]: I1011 10:53:12.581262 4790 generic.go:334] "Generic (PLEG): container finished" podID="c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0" containerID="ba831bf6dddeb3f64993c649a1f54543cd4a62c4464b0ea7cd2cf9d325aa66f4" exitCode=0 Oct 11 10:53:12.581605 master-0 kubenswrapper[4790]: I1011 10:53:12.581367 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dw8wx" 
event={"ID":"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0","Type":"ContainerDied","Data":"ba831bf6dddeb3f64993c649a1f54543cd4a62c4464b0ea7cd2cf9d325aa66f4"} Oct 11 10:53:12.583566 master-0 kubenswrapper[4790]: I1011 10:53:12.583522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"be929908-6474-451d-8b87-e4effd7c6de4","Type":"ContainerStarted","Data":"ec0350a7355f6c5da814acb3e4b50a985a39478e4987d701f3bdf168b3a6530a"} Oct 11 10:53:12.585689 master-0 kubenswrapper[4790]: I1011 10:53:12.585624 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c13cb0d1-c50f-44fa-824a-46ece423a7cc","Type":"ContainerStarted","Data":"c4be72b5de1183ec2ba8473fed3fe3c9aa390d6a7ab90458f3ecfc26bf72839f"} Oct 11 10:53:13.006364 master-0 kubenswrapper[4790]: I1011 10:53:13.006307 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-1"] Oct 11 10:53:13.013172 master-0 kubenswrapper[4790]: W1011 10:53:13.013120 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce689fd9_58ba_45f5_bec1_ff7b79e377ac.slice/crio-4fbef24be0c8b38f770cf0fc87b71794ae38966ded791ade342b14b09efe4324 WatchSource:0}: Error finding container 4fbef24be0c8b38f770cf0fc87b71794ae38966ded791ade342b14b09efe4324: Status 404 returned error can't find the container with id 4fbef24be0c8b38f770cf0fc87b71794ae38966ded791ade342b14b09efe4324 Oct 11 10:53:13.598552 master-0 kubenswrapper[4790]: I1011 10:53:13.598464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" event={"ID":"ce689fd9-58ba-45f5-bec1-ff7b79e377ac","Type":"ContainerStarted","Data":"4fbef24be0c8b38f770cf0fc87b71794ae38966ded791ade342b14b09efe4324"} Oct 11 10:53:13.608318 master-0 kubenswrapper[4790]: I1011 10:53:13.608232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dw8wx" 
event={"ID":"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0","Type":"ContainerStarted","Data":"e05059712b2d4bb66b7f19403495dbad2195c6a1d67625b3268de7b4d7fdeb2b"} Oct 11 10:53:13.608536 master-0 kubenswrapper[4790]: I1011 10:53:13.608340 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dw8wx" Oct 11 10:53:13.608536 master-0 kubenswrapper[4790]: I1011 10:53:13.608354 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dw8wx" event={"ID":"c3f53fa4-305a-45b2-8cf3-f4e97d6a5ea0","Type":"ContainerStarted","Data":"f451a78016f5347b80416239e80aefa101a979e977d58b1ffeeba6fb0a2ad963"} Oct 11 10:53:13.610830 master-0 kubenswrapper[4790]: I1011 10:53:13.610549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"1ebbbaaf-f668-4c40-b437-2e730aef3912","Type":"ContainerStarted","Data":"dc657647e19f7fc9de9910df8248a796ef7cb75c8ad8bbbaa74f15d4511985e3"} Oct 11 10:53:13.610830 master-0 kubenswrapper[4790]: I1011 10:53:13.610750 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerName="dnsmasq-dns" containerID="cri-o://92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2" gracePeriod=10 Oct 11 10:53:13.653033 master-0 kubenswrapper[4790]: I1011 10:53:13.652919 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dw8wx" podStartSLOduration=5.96967277 podStartE2EDuration="12.652896096s" podCreationTimestamp="2025-10-11 10:53:01 +0000 UTC" firstStartedPulling="2025-10-11 10:53:04.11015318 +0000 UTC m=+860.664613472" lastFinishedPulling="2025-10-11 10:53:10.793376506 +0000 UTC m=+867.347836798" observedRunningTime="2025-10-11 10:53:13.651407746 +0000 UTC m=+870.205868038" watchObservedRunningTime="2025-10-11 10:53:13.652896096 +0000 UTC m=+870.207356378" Oct 11 
10:53:13.887545 master-0 kubenswrapper[4790]: I1011 10:53:13.887463 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-2"] Oct 11 10:53:13.890523 master-0 kubenswrapper[4790]: I1011 10:53:13.890465 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.893105 master-0 kubenswrapper[4790]: I1011 10:53:13.893047 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 11 10:53:13.893897 master-0 kubenswrapper[4790]: I1011 10:53:13.893842 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 11 10:53:13.902376 master-0 kubenswrapper[4790]: I1011 10:53:13.902308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-2"] Oct 11 10:53:13.906260 master-0 kubenswrapper[4790]: I1011 10:53:13.906213 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 11 10:53:13.987537 master-0 kubenswrapper[4790]: I1011 10:53:13.987415 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.987537 master-0 kubenswrapper[4790]: I1011 10:53:13.987517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-secrets\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.987820 master-0 kubenswrapper[4790]: I1011 10:53:13.987552 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.987820 master-0 kubenswrapper[4790]: I1011 10:53:13.987578 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.987820 master-0 kubenswrapper[4790]: I1011 10:53:13.987616 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf6h8\" (UniqueName: \"kubernetes.io/projected/5059e0b0-120f-4498-8076-e3e9239b5688-kube-api-access-pf6h8\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.987820 master-0 kubenswrapper[4790]: I1011 10:53:13.987640 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.987820 master-0 kubenswrapper[4790]: I1011 10:53:13.987670 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c45139fd-bd49-4792-a4db-ab4cf7143532\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0ad3b211-066d-4f82-9b08-1cd78cfd71a3\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" 
Oct 11 10:53:13.987820 master-0 kubenswrapper[4790]: I1011 10:53:13.987705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5059e0b0-120f-4498-8076-e3e9239b5688-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:13.987820 master-0 kubenswrapper[4790]: I1011 10:53:13.987770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.089888 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.089948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.089996 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf6h8\" (UniqueName: \"kubernetes.io/projected/5059e0b0-120f-4498-8076-e3e9239b5688-kube-api-access-pf6h8\") pod \"openstack-cell1-galera-2\" (UID: 
\"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.090132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.090169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c45139fd-bd49-4792-a4db-ab4cf7143532\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0ad3b211-066d-4f82-9b08-1cd78cfd71a3\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.090230 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5059e0b0-120f-4498-8076-e3e9239b5688-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.090267 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.090302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: 
\"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.090365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-secrets\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.092956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.094512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-secrets\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.094962 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.095030 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5059e0b0-120f-4498-8076-e3e9239b5688-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " 
pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:14.104950 master-0 kubenswrapper[4790]: I1011 10:53:14.095648 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5059e0b0-120f-4498-8076-e3e9239b5688-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:14.109811 master-0 kubenswrapper[4790]: I1011 10:53:14.108612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:14.110224 master-0 kubenswrapper[4790]: I1011 10:53:14.110168 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 11 10:53:14.110312 master-0 kubenswrapper[4790]: I1011 10:53:14.110244 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c45139fd-bd49-4792-a4db-ab4cf7143532\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0ad3b211-066d-4f82-9b08-1cd78cfd71a3\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2d27f890eb1486f45bcfb322dc6233cd405573b410a561ca95e0dd6cc109f5f4/globalmount\"" pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:14.123377 master-0 kubenswrapper[4790]: I1011 10:53:14.122829 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5059e0b0-120f-4498-8076-e3e9239b5688-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:14.123682 master-0 kubenswrapper[4790]: I1011 10:53:14.123641 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf6h8\" (UniqueName: \"kubernetes.io/projected/5059e0b0-120f-4498-8076-e3e9239b5688-kube-api-access-pf6h8\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:14.157844 master-0 kubenswrapper[4790]: I1011 10:53:14.157436 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6944757b7f-plhvq"
Oct 11 10:53:14.193494 master-0 kubenswrapper[4790]: I1011 10:53:14.193016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wx2k\" (UniqueName: \"kubernetes.io/projected/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-kube-api-access-7wx2k\") pod \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") "
Oct 11 10:53:14.193494 master-0 kubenswrapper[4790]: I1011 10:53:14.193132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-config\") pod \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") "
Oct 11 10:53:14.193494 master-0 kubenswrapper[4790]: I1011 10:53:14.193231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-dns-svc\") pod \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\" (UID: \"f97fbf89-0d03-4ed8-a0d2-4f796e705e20\") "
Oct 11 10:53:14.212090 master-0 kubenswrapper[4790]: I1011 10:53:14.197847 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-kube-api-access-7wx2k" (OuterVolumeSpecName: "kube-api-access-7wx2k") pod "f97fbf89-0d03-4ed8-a0d2-4f796e705e20" (UID: "f97fbf89-0d03-4ed8-a0d2-4f796e705e20"). InnerVolumeSpecName "kube-api-access-7wx2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:53:14.212871 master-0 kubenswrapper[4790]: I1011 10:53:14.212825 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wx2k\" (UniqueName: \"kubernetes.io/projected/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-kube-api-access-7wx2k\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:14.230645 master-0 kubenswrapper[4790]: I1011 10:53:14.230569 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-config" (OuterVolumeSpecName: "config") pod "f97fbf89-0d03-4ed8-a0d2-4f796e705e20" (UID: "f97fbf89-0d03-4ed8-a0d2-4f796e705e20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 11 10:53:14.233139 master-0 kubenswrapper[4790]: I1011 10:53:14.233120 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f97fbf89-0d03-4ed8-a0d2-4f796e705e20" (UID: "f97fbf89-0d03-4ed8-a0d2-4f796e705e20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 11 10:53:14.316941 master-0 kubenswrapper[4790]: I1011 10:53:14.316874 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-config\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:14.316941 master-0 kubenswrapper[4790]: I1011 10:53:14.316926 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f97fbf89-0d03-4ed8-a0d2-4f796e705e20-dns-svc\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:14.620022 master-0 kubenswrapper[4790]: I1011 10:53:14.619965 4790 generic.go:334] "Generic (PLEG): container finished" podID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerID="92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2" exitCode=0
Oct 11 10:53:14.621212 master-0 kubenswrapper[4790]: I1011 10:53:14.620027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" event={"ID":"f97fbf89-0d03-4ed8-a0d2-4f796e705e20","Type":"ContainerDied","Data":"92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2"}
Oct 11 10:53:14.621323 master-0 kubenswrapper[4790]: I1011 10:53:14.621308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944757b7f-plhvq" event={"ID":"f97fbf89-0d03-4ed8-a0d2-4f796e705e20","Type":"ContainerDied","Data":"a4a9276558748dc6921cea20229a43f4acf57679858b58aec74901d23d4a131c"}
Oct 11 10:53:14.621398 master-0 kubenswrapper[4790]: I1011 10:53:14.621387 4790 scope.go:117] "RemoveContainer" containerID="92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2"
Oct 11 10:53:14.621498 master-0 kubenswrapper[4790]: I1011 10:53:14.621486 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:14.621562 master-0 kubenswrapper[4790]: I1011 10:53:14.620067 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6944757b7f-plhvq"
Oct 11 10:53:14.639875 master-0 kubenswrapper[4790]: I1011 10:53:14.639845 4790 scope.go:117] "RemoveContainer" containerID="00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa"
Oct 11 10:53:14.658485 master-0 kubenswrapper[4790]: I1011 10:53:14.658418 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6944757b7f-plhvq"]
Oct 11 10:53:14.663402 master-0 kubenswrapper[4790]: I1011 10:53:14.663381 4790 scope.go:117] "RemoveContainer" containerID="92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2"
Oct 11 10:53:14.663745 master-0 kubenswrapper[4790]: E1011 10:53:14.663705 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2\": container with ID starting with 92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2 not found: ID does not exist" containerID="92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2"
Oct 11 10:53:14.663805 master-0 kubenswrapper[4790]: I1011 10:53:14.663756 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2"} err="failed to get container status \"92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2\": rpc error: code = NotFound desc = could not find container \"92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2\": container with ID starting with 92b71e314ad33488ce83aa439ada543420fe070228f28b65e21677eb1d95f2b2 not found: ID does not exist"
Oct 11 10:53:14.663805 master-0 kubenswrapper[4790]: I1011 10:53:14.663783 4790 scope.go:117] "RemoveContainer" containerID="00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa"
Oct 11 10:53:14.664077 master-0 kubenswrapper[4790]: E1011 10:53:14.664056 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa\": container with ID starting with 00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa not found: ID does not exist" containerID="00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa"
Oct 11 10:53:14.664134 master-0 kubenswrapper[4790]: I1011 10:53:14.664080 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa"} err="failed to get container status \"00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa\": rpc error: code = NotFound desc = could not find container \"00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa\": container with ID starting with 00ae648526dd85e825feb5e90f78264a37eaa038e7b0cc07f203779fbd465caa not found: ID does not exist"
Oct 11 10:53:14.710460 master-0 kubenswrapper[4790]: I1011 10:53:14.710336 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6944757b7f-plhvq"]
Oct 11 10:53:15.400885 master-0 kubenswrapper[4790]: I1011 10:53:15.400831 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c45139fd-bd49-4792-a4db-ab4cf7143532\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0ad3b211-066d-4f82-9b08-1cd78cfd71a3\") pod \"openstack-cell1-galera-2\" (UID: \"5059e0b0-120f-4498-8076-e3e9239b5688\") " pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:15.710838 master-0 kubenswrapper[4790]: I1011 10:53:15.710545 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-2"
Oct 11 10:53:15.803194 master-0 kubenswrapper[4790]: I1011 10:53:15.803116 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 11 10:53:15.803466 master-0 kubenswrapper[4790]: E1011 10:53:15.803386 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerName="dnsmasq-dns"
Oct 11 10:53:15.803466 master-0 kubenswrapper[4790]: I1011 10:53:15.803405 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerName="dnsmasq-dns"
Oct 11 10:53:15.803466 master-0 kubenswrapper[4790]: E1011 10:53:15.803428 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerName="init"
Oct 11 10:53:15.803466 master-0 kubenswrapper[4790]: I1011 10:53:15.803438 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerName="init"
Oct 11 10:53:15.803628 master-0 kubenswrapper[4790]: I1011 10:53:15.803558 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" containerName="dnsmasq-dns"
Oct 11 10:53:15.804355 master-0 kubenswrapper[4790]: I1011 10:53:15.804332 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.813183 master-0 kubenswrapper[4790]: I1011 10:53:15.811982 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Oct 11 10:53:15.813183 master-0 kubenswrapper[4790]: I1011 10:53:15.812018 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Oct 11 10:53:15.813183 master-0 kubenswrapper[4790]: I1011 10:53:15.812018 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Oct 11 10:53:15.820270 master-0 kubenswrapper[4790]: I1011 10:53:15.819777 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842212 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d84ced78-2c7e-4de1-b3cb-1356f0f0b1e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1409533a-19dc-4ab6-9c9b-35e2a16fe065\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842304 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgl8\" (UniqueName: \"kubernetes.io/projected/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-kube-api-access-9qgl8\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-config\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842379 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842412 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.842533 master-0 kubenswrapper[4790]: I1011 10:53:15.842437 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.943987 master-0 kubenswrapper[4790]: I1011 10:53:15.943894 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.943987 master-0 kubenswrapper[4790]: I1011 10:53:15.943954 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d84ced78-2c7e-4de1-b3cb-1356f0f0b1e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1409533a-19dc-4ab6-9c9b-35e2a16fe065\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.943987 master-0 kubenswrapper[4790]: I1011 10:53:15.943984 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgl8\" (UniqueName: \"kubernetes.io/projected/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-kube-api-access-9qgl8\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.943987 master-0 kubenswrapper[4790]: I1011 10:53:15.944004 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.944513 master-0 kubenswrapper[4790]: I1011 10:53:15.944026 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-config\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.944513 master-0 kubenswrapper[4790]: I1011 10:53:15.944054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.944513 master-0 kubenswrapper[4790]: I1011 10:53:15.944082 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.944513 master-0 kubenswrapper[4790]: I1011 10:53:15.944108 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.947939 master-0 kubenswrapper[4790]: I1011 10:53:15.947890 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.951342 master-0 kubenswrapper[4790]: I1011 10:53:15.950410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.951750 master-0 kubenswrapper[4790]: I1011 10:53:15.951522 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-config\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.951750 master-0 kubenswrapper[4790]: I1011 10:53:15.951586 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.954114 master-0 kubenswrapper[4790]: I1011 10:53:15.953542 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 11 10:53:15.954114 master-0 kubenswrapper[4790]: I1011 10:53:15.953602 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d84ced78-2c7e-4de1-b3cb-1356f0f0b1e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1409533a-19dc-4ab6-9c9b-35e2a16fe065\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/30d66e2fad6d0ab5943ae8ba29bb4bba33e71d3d5a88128992813110331d9b74/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.954528 master-0 kubenswrapper[4790]: I1011 10:53:15.954448 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.966574 master-0 kubenswrapper[4790]: I1011 10:53:15.966450 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:15.972289 master-0 kubenswrapper[4790]: I1011 10:53:15.972026 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgl8\" (UniqueName: \"kubernetes.io/projected/c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70-kube-api-access-9qgl8\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:16.302302 master-0 kubenswrapper[4790]: I1011 10:53:16.302243 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f97fbf89-0d03-4ed8-a0d2-4f796e705e20" path="/var/lib/kubelet/pods/f97fbf89-0d03-4ed8-a0d2-4f796e705e20/volumes"
Oct 11 10:53:17.366519 master-0 kubenswrapper[4790]: I1011 10:53:17.365542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d84ced78-2c7e-4de1-b3cb-1356f0f0b1e1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1409533a-19dc-4ab6-9c9b-35e2a16fe065\") pod \"ovsdbserver-nb-0\" (UID: \"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70\") " pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:17.639518 master-0 kubenswrapper[4790]: I1011 10:53:17.639426 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 11 10:53:17.900933 master-0 kubenswrapper[4790]: I1011 10:53:17.900866 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-2"]
Oct 11 10:53:17.907361 master-0 kubenswrapper[4790]: W1011 10:53:17.907302 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5059e0b0_120f_4498_8076_e3e9239b5688.slice/crio-b7980a4d132fab58838654556376c0979ed90f167353b3483e55d73bcabd1c9d WatchSource:0}: Error finding container b7980a4d132fab58838654556376c0979ed90f167353b3483e55d73bcabd1c9d: Status 404 returned error can't find the container with id b7980a4d132fab58838654556376c0979ed90f167353b3483e55d73bcabd1c9d
Oct 11 10:53:18.220417 master-0 kubenswrapper[4790]: I1011 10:53:18.220358 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 11 10:53:18.224965 master-0 kubenswrapper[4790]: W1011 10:53:18.224891 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8612b0c_02a4_40ef_b5e7_71a8b8d6fe70.slice/crio-ffd70f6e5ebed6781320f8727f067080b40abc5ca784a9bc6515e0b0d19646ef WatchSource:0}: Error finding container ffd70f6e5ebed6781320f8727f067080b40abc5ca784a9bc6515e0b0d19646ef: Status 404 returned error can't find the container with id ffd70f6e5ebed6781320f8727f067080b40abc5ca784a9bc6515e0b0d19646ef
Oct 11 10:53:18.657617 master-0 kubenswrapper[4790]: I1011 10:53:18.657531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-2" event={"ID":"5059e0b0-120f-4498-8076-e3e9239b5688","Type":"ContainerStarted","Data":"16951753cab39fff859f3907a86a88a98732302e37279ed54b73a1c204defd9f"}
Oct 11 10:53:18.657617 master-0 kubenswrapper[4790]: I1011 10:53:18.657627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-2" event={"ID":"5059e0b0-120f-4498-8076-e3e9239b5688","Type":"ContainerStarted","Data":"b7980a4d132fab58838654556376c0979ed90f167353b3483e55d73bcabd1c9d"}
Oct 11 10:53:18.660908 master-0 kubenswrapper[4790]: I1011 10:53:18.660863 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" event={"ID":"ce689fd9-58ba-45f5-bec1-ff7b79e377ac","Type":"ContainerStarted","Data":"c48a5ea5685722211bada29d1dc5f400eb9f08b775cb3b65b8aa9fa7c05afb6a"}
Oct 11 10:53:18.662458 master-0 kubenswrapper[4790]: I1011 10:53:18.662399 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70","Type":"ContainerStarted","Data":"ffd70f6e5ebed6781320f8727f067080b40abc5ca784a9bc6515e0b0d19646ef"}
Oct 11 10:53:19.677985 master-0 kubenswrapper[4790]: I1011 10:53:19.677908 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70","Type":"ContainerStarted","Data":"c825292e80a7f9119d113b3bb262f7f111d6f5ad5e4978b8a04781a9d2fb1028"}
Oct 11 10:53:20.689485 master-0 kubenswrapper[4790]: I1011 10:53:20.689338 4790 generic.go:334] "Generic (PLEG): container finished" podID="1ebbbaaf-f668-4c40-b437-2e730aef3912" containerID="dc657647e19f7fc9de9910df8248a796ef7cb75c8ad8bbbaa74f15d4511985e3" exitCode=0
Oct 11 10:53:20.689485 master-0 kubenswrapper[4790]: I1011 10:53:20.689464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"1ebbbaaf-f668-4c40-b437-2e730aef3912","Type":"ContainerDied","Data":"dc657647e19f7fc9de9910df8248a796ef7cb75c8ad8bbbaa74f15d4511985e3"}
Oct 11 10:53:20.694694 master-0 kubenswrapper[4790]: I1011 10:53:20.694465 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c8612b0c-02a4-40ef-b5e7-71a8b8d6fe70","Type":"ContainerStarted","Data":"26985baa1c4b8f8f21210ddc96b676ad82501b72c746d0d846d399ec0fb50b76"}
Oct 11 10:53:20.770509 master-0 kubenswrapper[4790]: I1011 10:53:20.770404 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.530043721 podStartE2EDuration="20.770378617s" podCreationTimestamp="2025-10-11 10:53:00 +0000 UTC" firstStartedPulling="2025-10-11 10:53:18.228899331 +0000 UTC m=+874.783359623" lastFinishedPulling="2025-10-11 10:53:19.469234217 +0000 UTC m=+876.023694519" observedRunningTime="2025-10-11 10:53:20.762669913 +0000 UTC m=+877.317130235" watchObservedRunningTime="2025-10-11 10:53:20.770378617 +0000 UTC m=+877.324838929"
Oct 11 10:53:21.704743 master-0 kubenswrapper[4790]: I1011 10:53:21.704653 4790 generic.go:334] "Generic (PLEG): container finished" podID="ce689fd9-58ba-45f5-bec1-ff7b79e377ac" containerID="c48a5ea5685722211bada29d1dc5f400eb9f08b775cb3b65b8aa9fa7c05afb6a" exitCode=0
Oct 11 10:53:21.705565 master-0 kubenswrapper[4790]: I1011 10:53:21.704752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" event={"ID":"ce689fd9-58ba-45f5-bec1-ff7b79e377ac","Type":"ContainerDied","Data":"c48a5ea5685722211bada29d1dc5f400eb9f08b775cb3b65b8aa9fa7c05afb6a"}
Oct 11 10:53:22.091807 master-0 kubenswrapper[4790]: I1011 10:53:22.091692 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 11 10:53:22.094507 master-0 kubenswrapper[4790]: I1011 10:53:22.094474 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.097786 master-0 kubenswrapper[4790]: I1011 10:53:22.097700 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Oct 11 10:53:22.097920 master-0 kubenswrapper[4790]: I1011 10:53:22.097699 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Oct 11 10:53:22.105445 master-0 kubenswrapper[4790]: I1011 10:53:22.105298 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Oct 11 10:53:22.110780 master-0 kubenswrapper[4790]: I1011 10:53:22.110508 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 11 10:53:22.261667 master-0 kubenswrapper[4790]: I1011 10:53:22.260921 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/403c433e-e6f3-4732-9d44-95e68ac5d36d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.261667 master-0 kubenswrapper[4790]: I1011 10:53:22.261162 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.261667 master-0 kubenswrapper[4790]: I1011 10:53:22.261383 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-888b07c6-0727-400b-ade1-5d81d76c7ca4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6484ec26-9b3d-4dbd-a479-93aa7b5ec920\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.261667 master-0 kubenswrapper[4790]: I1011 10:53:22.261536 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.262092 master-0 kubenswrapper[4790]: I1011 10:53:22.261765 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/403c433e-e6f3-4732-9d44-95e68ac5d36d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.262092 master-0 kubenswrapper[4790]: I1011 10:53:22.261868 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.262092 master-0 kubenswrapper[4790]: I1011 10:53:22.262048 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbj45\" (UniqueName: \"kubernetes.io/projected/403c433e-e6f3-4732-9d44-95e68ac5d36d-kube-api-access-rbj45\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.262228 master-0 kubenswrapper[4790]: I1011 10:53:22.262105 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403c433e-e6f3-4732-9d44-95e68ac5d36d-config\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364238 master-0 kubenswrapper[4790]: I1011 10:53:22.364077 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/403c433e-e6f3-4732-9d44-95e68ac5d36d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364238 master-0 kubenswrapper[4790]: I1011 10:53:22.364161 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364238 master-0 kubenswrapper[4790]: I1011 10:53:22.364198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-888b07c6-0727-400b-ade1-5d81d76c7ca4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6484ec26-9b3d-4dbd-a479-93aa7b5ec920\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364238 master-0 kubenswrapper[4790]: I1011 10:53:22.364237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364579 master-0 kubenswrapper[4790]: I1011 10:53:22.364278 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/403c433e-e6f3-4732-9d44-95e68ac5d36d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364579 master-0 kubenswrapper[4790]: I1011 10:53:22.364303 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364579 master-0 kubenswrapper[4790]: I1011 10:53:22.364355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbj45\" (UniqueName: \"kubernetes.io/projected/403c433e-e6f3-4732-9d44-95e68ac5d36d-kube-api-access-rbj45\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364579 master-0 kubenswrapper[4790]: I1011 10:53:22.364393 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403c433e-e6f3-4732-9d44-95e68ac5d36d-config\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.364874 master-0 kubenswrapper[4790]: I1011 10:53:22.364822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/403c433e-e6f3-4732-9d44-95e68ac5d36d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.365924 master-0 kubenswrapper[4790]: I1011 10:53:22.365899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/403c433e-e6f3-4732-9d44-95e68ac5d36d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.366437 master-0 kubenswrapper[4790]: I1011 10:53:22.366389 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403c433e-e6f3-4732-9d44-95e68ac5d36d-config\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.369492 master-0 kubenswrapper[4790]: I1011 10:53:22.369466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.369566 master-0 kubenswrapper[4790]: I1011 10:53:22.369540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.369755 master-0 kubenswrapper[4790]: I1011 10:53:22.369696 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/403c433e-e6f3-4732-9d44-95e68ac5d36d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2"
Oct 11 10:53:22.369812 master-0 kubenswrapper[4790]: I1011 10:53:22.369790 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 11 10:53:22.369875 master-0 kubenswrapper[4790]: I1011 10:53:22.369846 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-888b07c6-0727-400b-ade1-5d81d76c7ca4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6484ec26-9b3d-4dbd-a479-93aa7b5ec920\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/f6c73a3ac7e9ba275a8482e3555a12863487c6a78cc7244a68d6b0801064717f/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:22.398836 master-0 kubenswrapper[4790]: I1011 10:53:22.398779 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbj45\" (UniqueName: \"kubernetes.io/projected/403c433e-e6f3-4732-9d44-95e68ac5d36d-kube-api-access-rbj45\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:22.639824 master-0 kubenswrapper[4790]: I1011 10:53:22.639620 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 11 10:53:22.725478 master-0 kubenswrapper[4790]: I1011 10:53:22.725411 4790 generic.go:334] "Generic (PLEG): container finished" podID="5059e0b0-120f-4498-8076-e3e9239b5688" containerID="16951753cab39fff859f3907a86a88a98732302e37279ed54b73a1c204defd9f" exitCode=0 Oct 11 10:53:22.726208 master-0 kubenswrapper[4790]: I1011 10:53:22.725496 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-2" event={"ID":"5059e0b0-120f-4498-8076-e3e9239b5688","Type":"ContainerDied","Data":"16951753cab39fff859f3907a86a88a98732302e37279ed54b73a1c204defd9f"} Oct 11 10:53:22.737775 master-0 kubenswrapper[4790]: I1011 10:53:22.734786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" 
event={"ID":"ce689fd9-58ba-45f5-bec1-ff7b79e377ac","Type":"ContainerStarted","Data":"218379a1cbe6c195eee20a66e0037e4a1d22bbf093737b2701289a4711a702af"} Oct 11 10:53:23.640105 master-0 kubenswrapper[4790]: I1011 10:53:23.640042 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 11 10:53:23.682737 master-0 kubenswrapper[4790]: I1011 10:53:23.682670 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 11 10:53:23.720121 master-0 kubenswrapper[4790]: I1011 10:53:23.720021 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-1" podStartSLOduration=27.223252139 podStartE2EDuration="31.719995306s" podCreationTimestamp="2025-10-11 10:52:52 +0000 UTC" firstStartedPulling="2025-10-11 10:53:13.017481127 +0000 UTC m=+869.571941419" lastFinishedPulling="2025-10-11 10:53:17.514224284 +0000 UTC m=+874.068684586" observedRunningTime="2025-10-11 10:53:22.82061776 +0000 UTC m=+879.375078072" watchObservedRunningTime="2025-10-11 10:53:23.719995306 +0000 UTC m=+880.274455598" Oct 11 10:53:23.729505 master-0 kubenswrapper[4790]: I1011 10:53:23.729435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-888b07c6-0727-400b-ade1-5d81d76c7ca4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6484ec26-9b3d-4dbd-a479-93aa7b5ec920\") pod \"ovsdbserver-sb-2\" (UID: \"403c433e-e6f3-4732-9d44-95e68ac5d36d\") " pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:23.742201 master-0 kubenswrapper[4790]: I1011 10:53:23.742124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"1ebbbaaf-f668-4c40-b437-2e730aef3912","Type":"ContainerStarted","Data":"b3ffe691ff1ca6cc8bb2407f87f44ff24c582eea93bce9c33941b8085aadde4d"} Oct 11 10:53:23.744563 master-0 kubenswrapper[4790]: I1011 10:53:23.744521 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-2" event={"ID":"5059e0b0-120f-4498-8076-e3e9239b5688","Type":"ContainerStarted","Data":"c7f6c6a8ec8f298fea18ce5bdc6aa316147d7be0432d97fcbb7db7b50ab1c846"} Oct 11 10:53:23.782042 master-0 kubenswrapper[4790]: I1011 10:53:23.781945 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-2" podStartSLOduration=30.781923112 podStartE2EDuration="30.781923112s" podCreationTimestamp="2025-10-11 10:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:53:23.779378072 +0000 UTC m=+880.333838354" watchObservedRunningTime="2025-10-11 10:53:23.781923112 +0000 UTC m=+880.336383404" Oct 11 10:53:23.921414 master-0 kubenswrapper[4790]: I1011 10:53:23.921257 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:24.480745 master-0 kubenswrapper[4790]: I1011 10:53:24.478087 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 11 10:53:24.493193 master-0 kubenswrapper[4790]: W1011 10:53:24.493117 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod403c433e_e6f3_4732_9d44_95e68ac5d36d.slice/crio-ace6ed88778d45d44bc00c0181b4956b0d16db77e488399fca98accbdae65b5f WatchSource:0}: Error finding container ace6ed88778d45d44bc00c0181b4956b0d16db77e488399fca98accbdae65b5f: Status 404 returned error can't find the container with id ace6ed88778d45d44bc00c0181b4956b0d16db77e488399fca98accbdae65b5f Oct 11 10:53:24.755635 master-0 kubenswrapper[4790]: I1011 10:53:24.755499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"403c433e-e6f3-4732-9d44-95e68ac5d36d","Type":"ContainerStarted","Data":"ace6ed88778d45d44bc00c0181b4956b0d16db77e488399fca98accbdae65b5f"} Oct 11 
10:53:24.791727 master-0 kubenswrapper[4790]: I1011 10:53:24.791659 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 11 10:53:25.170160 master-0 kubenswrapper[4790]: I1011 10:53:25.170081 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77b589666c-j6ppm"] Oct 11 10:53:25.171542 master-0 kubenswrapper[4790]: I1011 10:53:25.171337 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.176152 master-0 kubenswrapper[4790]: I1011 10:53:25.175143 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 10:53:25.176152 master-0 kubenswrapper[4790]: I1011 10:53:25.175388 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 10:53:25.176152 master-0 kubenswrapper[4790]: I1011 10:53:25.175538 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 10:53:25.193998 master-0 kubenswrapper[4790]: I1011 10:53:25.193916 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b589666c-j6ppm"] Oct 11 10:53:25.313875 master-0 kubenswrapper[4790]: I1011 10:53:25.313121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-ovsdbserver-nb\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.313875 master-0 kubenswrapper[4790]: I1011 10:53:25.313215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-687pg\" (UniqueName: \"kubernetes.io/projected/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-kube-api-access-687pg\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: 
\"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.313875 master-0 kubenswrapper[4790]: I1011 10:53:25.313262 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-dns-svc\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.313875 master-0 kubenswrapper[4790]: I1011 10:53:25.313292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-config\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.414864 master-0 kubenswrapper[4790]: I1011 10:53:25.414794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-config\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.415113 master-0 kubenswrapper[4790]: I1011 10:53:25.414900 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-ovsdbserver-nb\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.415113 master-0 kubenswrapper[4790]: I1011 10:53:25.414951 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-687pg\" (UniqueName: \"kubernetes.io/projected/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-kube-api-access-687pg\") pod 
\"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.415113 master-0 kubenswrapper[4790]: I1011 10:53:25.414997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-dns-svc\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.415772 master-0 kubenswrapper[4790]: I1011 10:53:25.415746 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-dns-svc\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.415836 master-0 kubenswrapper[4790]: I1011 10:53:25.415753 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-config\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.416446 master-0 kubenswrapper[4790]: I1011 10:53:25.416388 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-ovsdbserver-nb\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.438754 master-0 kubenswrapper[4790]: I1011 10:53:25.438674 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-687pg\" (UniqueName: \"kubernetes.io/projected/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-kube-api-access-687pg\") pod \"dnsmasq-dns-77b589666c-j6ppm\" (UID: 
\"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.536124 master-0 kubenswrapper[4790]: I1011 10:53:25.536002 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:25.711904 master-0 kubenswrapper[4790]: I1011 10:53:25.711846 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:25.713369 master-0 kubenswrapper[4790]: I1011 10:53:25.712767 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:25.771734 master-0 kubenswrapper[4790]: I1011 10:53:25.769775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"403c433e-e6f3-4732-9d44-95e68ac5d36d","Type":"ContainerStarted","Data":"f6cb2f899d9b65c53d73e37291a08fdecba73d9678a629ead4b84f1bd180b1f8"} Oct 11 10:53:25.774883 master-0 kubenswrapper[4790]: I1011 10:53:25.772543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"1ebbbaaf-f668-4c40-b437-2e730aef3912","Type":"ContainerStarted","Data":"3e61583376711b9abec6bb152782b40e7556f75aa64fba32d72957df4aae9f91"} Oct 11 10:53:25.805743 master-0 kubenswrapper[4790]: I1011 10:53:25.804891 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-1" podStartSLOduration=8.341887527 podStartE2EDuration="28.804871048s" podCreationTimestamp="2025-10-11 10:52:57 +0000 UTC" firstStartedPulling="2025-10-11 10:53:02.829045044 +0000 UTC m=+859.383505336" lastFinishedPulling="2025-10-11 10:53:23.292028565 +0000 UTC m=+879.846488857" observedRunningTime="2025-10-11 10:53:25.801507375 +0000 UTC m=+882.355967687" watchObservedRunningTime="2025-10-11 10:53:25.804871048 +0000 UTC m=+882.359331340" Oct 11 10:53:25.958097 master-0 kubenswrapper[4790]: W1011 
10:53:25.957576 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2fc4e6a_fc2f_4ed0_a753_f5a54f83777c.slice/crio-b82148781df79646d5b0c5a4a653ab9892598923f30fe65b95aab44fedfae266 WatchSource:0}: Error finding container b82148781df79646d5b0c5a4a653ab9892598923f30fe65b95aab44fedfae266: Status 404 returned error can't find the container with id b82148781df79646d5b0c5a4a653ab9892598923f30fe65b95aab44fedfae266 Oct 11 10:53:25.959588 master-0 kubenswrapper[4790]: I1011 10:53:25.959505 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b589666c-j6ppm"] Oct 11 10:53:26.000004 master-0 kubenswrapper[4790]: I1011 10:53:25.999758 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-2"] Oct 11 10:53:26.001254 master-0 kubenswrapper[4790]: I1011 10:53:26.001224 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-2" Oct 11 10:53:26.014357 master-0 kubenswrapper[4790]: I1011 10:53:26.007989 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 11 10:53:26.014357 master-0 kubenswrapper[4790]: I1011 10:53:26.008255 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 11 10:53:26.018641 master-0 kubenswrapper[4790]: I1011 10:53:26.017945 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-2"] Oct 11 10:53:26.132186 master-0 kubenswrapper[4790]: I1011 10:53:26.132097 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d07581a-888a-4d3f-890b-550587e5657e-combined-ca-bundle\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.132767 master-0 kubenswrapper[4790]: I1011 10:53:26.132232 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d07581a-888a-4d3f-890b-550587e5657e-kolla-config\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.132767 master-0 kubenswrapper[4790]: I1011 10:53:26.132279 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d07581a-888a-4d3f-890b-550587e5657e-memcached-tls-certs\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.132767 master-0 kubenswrapper[4790]: I1011 10:53:26.132304 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jpn6\" (UniqueName: \"kubernetes.io/projected/2d07581a-888a-4d3f-890b-550587e5657e-kube-api-access-7jpn6\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.132767 master-0 kubenswrapper[4790]: I1011 10:53:26.132335 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d07581a-888a-4d3f-890b-550587e5657e-config-data\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.233625 master-0 kubenswrapper[4790]: I1011 10:53:26.233471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d07581a-888a-4d3f-890b-550587e5657e-kolla-config\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.233625 master-0 kubenswrapper[4790]: I1011 10:53:26.233569 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d07581a-888a-4d3f-890b-550587e5657e-memcached-tls-certs\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.233625 master-0 kubenswrapper[4790]: I1011 10:53:26.233596 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jpn6\" (UniqueName: \"kubernetes.io/projected/2d07581a-888a-4d3f-890b-550587e5657e-kube-api-access-7jpn6\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.233625 master-0 kubenswrapper[4790]: I1011 10:53:26.233632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d07581a-888a-4d3f-890b-550587e5657e-config-data\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.233950 master-0 kubenswrapper[4790]: I1011 10:53:26.233658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d07581a-888a-4d3f-890b-550587e5657e-combined-ca-bundle\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.235333 master-0 kubenswrapper[4790]: I1011 10:53:26.234557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d07581a-888a-4d3f-890b-550587e5657e-kolla-config\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.235333 master-0 kubenswrapper[4790]: I1011 10:53:26.235237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d07581a-888a-4d3f-890b-550587e5657e-config-data\") pod \"memcached-2\" (UID: 
\"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.237536 master-0 kubenswrapper[4790]: I1011 10:53:26.237484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d07581a-888a-4d3f-890b-550587e5657e-memcached-tls-certs\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.239470 master-0 kubenswrapper[4790]: I1011 10:53:26.239431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d07581a-888a-4d3f-890b-550587e5657e-combined-ca-bundle\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.264455 master-0 kubenswrapper[4790]: I1011 10:53:26.264363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jpn6\" (UniqueName: \"kubernetes.io/projected/2d07581a-888a-4d3f-890b-550587e5657e-kube-api-access-7jpn6\") pod \"memcached-2\" (UID: \"2d07581a-888a-4d3f-890b-550587e5657e\") " pod="openstack/memcached-2" Oct 11 10:53:26.341942 master-0 kubenswrapper[4790]: I1011 10:53:26.341858 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-2" Oct 11 10:53:26.784165 master-0 kubenswrapper[4790]: I1011 10:53:26.784007 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"403c433e-e6f3-4732-9d44-95e68ac5d36d","Type":"ContainerStarted","Data":"861af966cb71f0799412276771ee1a44c25b273948db6102376d80a616fe7ce5"} Oct 11 10:53:26.786790 master-0 kubenswrapper[4790]: I1011 10:53:26.785540 4790 generic.go:334] "Generic (PLEG): container finished" podID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerID="5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720" exitCode=0 Oct 11 10:53:26.786790 master-0 kubenswrapper[4790]: I1011 10:53:26.785610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" event={"ID":"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c","Type":"ContainerDied","Data":"5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720"} Oct 11 10:53:26.786790 master-0 kubenswrapper[4790]: I1011 10:53:26.785645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" event={"ID":"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c","Type":"ContainerStarted","Data":"b82148781df79646d5b0c5a4a653ab9892598923f30fe65b95aab44fedfae266"} Oct 11 10:53:26.786790 master-0 kubenswrapper[4790]: I1011 10:53:26.786754 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-1" Oct 11 10:53:26.794026 master-0 kubenswrapper[4790]: I1011 10:53:26.793974 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-1" Oct 11 10:53:26.795677 master-0 kubenswrapper[4790]: I1011 10:53:26.795628 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-2"] Oct 11 10:53:26.803540 master-0 kubenswrapper[4790]: W1011 10:53:26.803478 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d07581a_888a_4d3f_890b_550587e5657e.slice/crio-07126fb93a2731bc90528e8dbe62a96535875a4d288aaa47518cc37c60307378 WatchSource:0}: Error finding container 07126fb93a2731bc90528e8dbe62a96535875a4d288aaa47518cc37c60307378: Status 404 returned error can't find the container with id 07126fb93a2731bc90528e8dbe62a96535875a4d288aaa47518cc37c60307378 Oct 11 10:53:26.822588 master-0 kubenswrapper[4790]: I1011 10:53:26.822520 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=16.800470886 podStartE2EDuration="17.822491741s" podCreationTimestamp="2025-10-11 10:53:09 +0000 UTC" firstStartedPulling="2025-10-11 10:53:24.499287434 +0000 UTC m=+881.053747726" lastFinishedPulling="2025-10-11 10:53:25.521308289 +0000 UTC m=+882.075768581" observedRunningTime="2025-10-11 10:53:26.808143503 +0000 UTC m=+883.362603825" watchObservedRunningTime="2025-10-11 10:53:26.822491741 +0000 UTC m=+883.376952033" Oct 11 10:53:26.922649 master-0 kubenswrapper[4790]: I1011 10:53:26.922429 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:27.795761 master-0 kubenswrapper[4790]: I1011 10:53:27.795558 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" event={"ID":"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c","Type":"ContainerStarted","Data":"0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3"} Oct 11 10:53:27.795761 master-0 kubenswrapper[4790]: I1011 10:53:27.795745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:27.798545 master-0 kubenswrapper[4790]: I1011 10:53:27.798466 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-2" 
event={"ID":"2d07581a-888a-4d3f-890b-550587e5657e","Type":"ContainerStarted","Data":"07126fb93a2731bc90528e8dbe62a96535875a4d288aaa47518cc37c60307378"} Oct 11 10:53:27.822062 master-0 kubenswrapper[4790]: I1011 10:53:27.821955 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" podStartSLOduration=2.82192503 podStartE2EDuration="2.82192503s" podCreationTimestamp="2025-10-11 10:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:53:27.818793534 +0000 UTC m=+884.373253826" watchObservedRunningTime="2025-10-11 10:53:27.82192503 +0000 UTC m=+884.376385332" Oct 11 10:53:28.921559 master-0 kubenswrapper[4790]: I1011 10:53:28.921518 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:29.837473 master-0 kubenswrapper[4790]: I1011 10:53:29.837330 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-2" event={"ID":"2d07581a-888a-4d3f-890b-550587e5657e","Type":"ContainerStarted","Data":"3d7d2e12092061298f8a990c80b59dad5f4f26ac715667c9d48cd66a44e6d3b6"} Oct 11 10:53:29.837942 master-0 kubenswrapper[4790]: I1011 10:53:29.837734 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-2" Oct 11 10:53:29.895902 master-0 kubenswrapper[4790]: I1011 10:53:29.895797 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-2" podStartSLOduration=2.941227627 podStartE2EDuration="4.895764457s" podCreationTimestamp="2025-10-11 10:53:25 +0000 UTC" firstStartedPulling="2025-10-11 10:53:26.807556007 +0000 UTC m=+883.362016299" lastFinishedPulling="2025-10-11 10:53:28.762092827 +0000 UTC m=+885.316553129" observedRunningTime="2025-10-11 10:53:29.884583428 +0000 UTC m=+886.439043750" watchObservedRunningTime="2025-10-11 10:53:29.895764457 +0000 UTC 
m=+886.450224779" Oct 11 10:53:29.968226 master-0 kubenswrapper[4790]: I1011 10:53:29.968120 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:30.907056 master-0 kubenswrapper[4790]: I1011 10:53:30.906963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 11 10:53:31.309975 master-0 kubenswrapper[4790]: I1011 10:53:31.309840 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77b589666c-j6ppm"] Oct 11 10:53:31.310663 master-0 kubenswrapper[4790]: I1011 10:53:31.310217 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" podUID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerName="dnsmasq-dns" containerID="cri-o://0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3" gracePeriod=10 Oct 11 10:53:31.315897 master-0 kubenswrapper[4790]: I1011 10:53:31.315852 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:31.846157 master-0 kubenswrapper[4790]: I1011 10:53:31.846094 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:31.858507 master-0 kubenswrapper[4790]: I1011 10:53:31.858437 4790 generic.go:334] "Generic (PLEG): container finished" podID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerID="0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3" exitCode=0 Oct 11 10:53:31.858663 master-0 kubenswrapper[4790]: I1011 10:53:31.858513 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" Oct 11 10:53:31.858663 master-0 kubenswrapper[4790]: I1011 10:53:31.858514 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" event={"ID":"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c","Type":"ContainerDied","Data":"0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3"} Oct 11 10:53:31.858663 master-0 kubenswrapper[4790]: I1011 10:53:31.858597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b589666c-j6ppm" event={"ID":"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c","Type":"ContainerDied","Data":"b82148781df79646d5b0c5a4a653ab9892598923f30fe65b95aab44fedfae266"} Oct 11 10:53:31.858663 master-0 kubenswrapper[4790]: I1011 10:53:31.858628 4790 scope.go:117] "RemoveContainer" containerID="0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3" Oct 11 10:53:31.881934 master-0 kubenswrapper[4790]: I1011 10:53:31.880801 4790 scope.go:117] "RemoveContainer" containerID="5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720" Oct 11 10:53:31.899870 master-0 kubenswrapper[4790]: I1011 10:53:31.899836 4790 scope.go:117] "RemoveContainer" containerID="0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3" Oct 11 10:53:31.900407 master-0 kubenswrapper[4790]: E1011 10:53:31.900382 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3\": container with ID starting with 0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3 not found: ID does not exist" containerID="0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3" Oct 11 10:53:31.900477 master-0 kubenswrapper[4790]: I1011 10:53:31.900423 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3"} err="failed to get container status \"0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3\": rpc error: code = NotFound desc = could not find container \"0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3\": container with ID starting with 0cc3af213ad614ffbe7a12dc806856290a11df301f2407682b989da7fc2469d3 not found: ID does not exist" Oct 11 10:53:31.900477 master-0 kubenswrapper[4790]: I1011 10:53:31.900452 4790 scope.go:117] "RemoveContainer" containerID="5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720" Oct 11 10:53:31.901002 master-0 kubenswrapper[4790]: E1011 10:53:31.900978 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720\": container with ID starting with 5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720 not found: ID does not exist" containerID="5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720" Oct 11 10:53:31.901117 master-0 kubenswrapper[4790]: I1011 10:53:31.901092 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720"} err="failed to get container status \"5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720\": rpc error: code = NotFound desc = could not find container \"5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720\": container with ID starting with 5420918b5cf67f23160281d51d5a46679486566bda34999aa7dc361bb9c90720 not found: ID does not exist" Oct 11 10:53:31.983113 master-0 kubenswrapper[4790]: I1011 10:53:31.983030 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-config\") pod 
\"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " Oct 11 10:53:31.983526 master-0 kubenswrapper[4790]: I1011 10:53:31.983221 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-ovsdbserver-nb\") pod \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " Oct 11 10:53:31.983526 master-0 kubenswrapper[4790]: I1011 10:53:31.983270 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-dns-svc\") pod \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " Oct 11 10:53:31.983526 master-0 kubenswrapper[4790]: I1011 10:53:31.983307 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-687pg\" (UniqueName: \"kubernetes.io/projected/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-kube-api-access-687pg\") pod \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\" (UID: \"c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c\") " Oct 11 10:53:31.988467 master-0 kubenswrapper[4790]: I1011 10:53:31.988395 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-kube-api-access-687pg" (OuterVolumeSpecName: "kube-api-access-687pg") pod "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" (UID: "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c"). InnerVolumeSpecName "kube-api-access-687pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:53:32.013264 master-0 kubenswrapper[4790]: I1011 10:53:32.013186 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" (UID: "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:53:32.013472 master-0 kubenswrapper[4790]: I1011 10:53:32.013197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" (UID: "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:53:32.020245 master-0 kubenswrapper[4790]: I1011 10:53:32.020163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-config" (OuterVolumeSpecName: "config") pod "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" (UID: "c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:53:32.085616 master-0 kubenswrapper[4790]: I1011 10:53:32.085552 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:53:32.085616 master-0 kubenswrapper[4790]: I1011 10:53:32.085603 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-dns-svc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:53:32.085616 master-0 kubenswrapper[4790]: I1011 10:53:32.085613 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-687pg\" (UniqueName: \"kubernetes.io/projected/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-kube-api-access-687pg\") on node \"master-0\" DevicePath \"\"" Oct 11 10:53:32.085616 master-0 kubenswrapper[4790]: I1011 10:53:32.085626 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:53:32.197491 master-0 kubenswrapper[4790]: I1011 10:53:32.197431 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-1" Oct 11 10:53:32.197662 master-0 kubenswrapper[4790]: I1011 10:53:32.197527 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-1" Oct 11 10:53:32.224477 master-0 kubenswrapper[4790]: I1011 10:53:32.224144 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77b589666c-j6ppm"] Oct 11 10:53:32.241814 master-0 kubenswrapper[4790]: I1011 10:53:32.241225 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77b589666c-j6ppm"] Oct 11 10:53:32.304306 master-0 kubenswrapper[4790]: I1011 10:53:32.304138 4790 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" path="/var/lib/kubelet/pods/c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c/volumes" Oct 11 10:53:34.316557 master-0 kubenswrapper[4790]: I1011 10:53:34.316494 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:34.367478 master-0 kubenswrapper[4790]: I1011 10:53:34.367379 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-2" Oct 11 10:53:36.344298 master-0 kubenswrapper[4790]: I1011 10:53:36.344211 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-2" Oct 11 10:53:41.504904 master-0 kubenswrapper[4790]: I1011 10:53:41.504782 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-52t2l" podUID="8c164a4b-a2d5-4570-aed3-86dbb1f3d47c" containerName="ovn-controller" probeResult="failure" output=< Oct 11 10:53:41.504904 master-0 kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 11 10:53:41.504904 master-0 kubenswrapper[4790]: > Oct 11 10:53:42.134828 master-0 kubenswrapper[4790]: I1011 10:53:42.134764 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c99f4877f-dv8jt"] Oct 11 10:53:42.135229 master-0 kubenswrapper[4790]: E1011 10:53:42.135180 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerName="init" Oct 11 10:53:42.135229 master-0 kubenswrapper[4790]: I1011 10:53:42.135196 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerName="init" Oct 11 10:53:42.135229 master-0 kubenswrapper[4790]: E1011 10:53:42.135224 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerName="dnsmasq-dns" 
Oct 11 10:53:42.135229 master-0 kubenswrapper[4790]: I1011 10:53:42.135230 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerName="dnsmasq-dns" Oct 11 10:53:42.135377 master-0 kubenswrapper[4790]: I1011 10:53:42.135369 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fc4e6a-fc2f-4ed0-a753-f5a54f83777c" containerName="dnsmasq-dns" Oct 11 10:53:42.136478 master-0 kubenswrapper[4790]: I1011 10:53:42.136420 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.140027 master-0 kubenswrapper[4790]: I1011 10:53:42.139973 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 10:53:42.140298 master-0 kubenswrapper[4790]: I1011 10:53:42.140255 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 10:53:42.140463 master-0 kubenswrapper[4790]: I1011 10:53:42.140440 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 10:53:42.140736 master-0 kubenswrapper[4790]: I1011 10:53:42.140689 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 11 10:53:42.159791 master-0 kubenswrapper[4790]: I1011 10:53:42.157226 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c99f4877f-dv8jt"] Oct 11 10:53:42.181162 master-0 kubenswrapper[4790]: I1011 10:53:42.181039 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-nb\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.181503 master-0 kubenswrapper[4790]: I1011 10:53:42.181300 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-sb\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.181503 master-0 kubenswrapper[4790]: I1011 10:53:42.181372 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-dns-svc\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.181503 master-0 kubenswrapper[4790]: I1011 10:53:42.181451 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjqms\" (UniqueName: \"kubernetes.io/projected/6a4dc537-c4a3-4538-887f-62fe3919d5f0-kube-api-access-kjqms\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.181620 master-0 kubenswrapper[4790]: I1011 10:53:42.181557 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-config\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.283281 master-0 kubenswrapper[4790]: I1011 10:53:42.283214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjqms\" (UniqueName: \"kubernetes.io/projected/6a4dc537-c4a3-4538-887f-62fe3919d5f0-kube-api-access-kjqms\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.283281 
master-0 kubenswrapper[4790]: I1011 10:53:42.283302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-config\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.283628 master-0 kubenswrapper[4790]: I1011 10:53:42.283353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-nb\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.283628 master-0 kubenswrapper[4790]: I1011 10:53:42.283393 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-sb\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.283628 master-0 kubenswrapper[4790]: I1011 10:53:42.283422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-dns-svc\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.284962 master-0 kubenswrapper[4790]: I1011 10:53:42.284842 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-dns-svc\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.284962 master-0 kubenswrapper[4790]: I1011 
10:53:42.284840 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-config\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.285072 master-0 kubenswrapper[4790]: I1011 10:53:42.284984 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-sb\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.286574 master-0 kubenswrapper[4790]: I1011 10:53:42.286528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-nb\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.305842 master-0 kubenswrapper[4790]: I1011 10:53:42.305787 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjqms\" (UniqueName: \"kubernetes.io/projected/6a4dc537-c4a3-4538-887f-62fe3919d5f0-kube-api-access-kjqms\") pod \"dnsmasq-dns-6c99f4877f-dv8jt\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.470929 master-0 kubenswrapper[4790]: I1011 10:53:42.470218 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:53:42.961768 master-0 kubenswrapper[4790]: I1011 10:53:42.961671 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c99f4877f-dv8jt"] Oct 11 10:53:42.971901 master-0 kubenswrapper[4790]: W1011 10:53:42.971433 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a4dc537_c4a3_4538_887f_62fe3919d5f0.slice/crio-ff20f3c33adc8038a2f30426436b1b65b746807b9fb9fe4c5e10d86eebcbb3ee WatchSource:0}: Error finding container ff20f3c33adc8038a2f30426436b1b65b746807b9fb9fe4c5e10d86eebcbb3ee: Status 404 returned error can't find the container with id ff20f3c33adc8038a2f30426436b1b65b746807b9fb9fe4c5e10d86eebcbb3ee Oct 11 10:53:43.979013 master-0 kubenswrapper[4790]: I1011 10:53:43.978951 4790 generic.go:334] "Generic (PLEG): container finished" podID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerID="42923cd7993a370d966eb589a3c5dfe41bcbc3a27770fa8b1538dbc31e8e9a97" exitCode=0 Oct 11 10:53:43.979676 master-0 kubenswrapper[4790]: I1011 10:53:43.979030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" event={"ID":"6a4dc537-c4a3-4538-887f-62fe3919d5f0","Type":"ContainerDied","Data":"42923cd7993a370d966eb589a3c5dfe41bcbc3a27770fa8b1538dbc31e8e9a97"} Oct 11 10:53:43.979676 master-0 kubenswrapper[4790]: I1011 10:53:43.979113 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" event={"ID":"6a4dc537-c4a3-4538-887f-62fe3919d5f0","Type":"ContainerStarted","Data":"ff20f3c33adc8038a2f30426436b1b65b746807b9fb9fe4c5e10d86eebcbb3ee"} Oct 11 10:53:44.135630 master-0 kubenswrapper[4790]: I1011 10:53:44.135301 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 11 10:53:44.145620 master-0 kubenswrapper[4790]: I1011 10:53:44.145460 4790 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/swift-storage-0" Oct 11 10:53:44.190846 master-0 kubenswrapper[4790]: I1011 10:53:44.190776 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 11 10:53:44.191685 master-0 kubenswrapper[4790]: I1011 10:53:44.191636 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 11 10:53:44.192134 master-0 kubenswrapper[4790]: I1011 10:53:44.192100 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 11 10:53:44.192305 master-0 kubenswrapper[4790]: I1011 10:53:44.192279 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 11 10:53:44.226314 master-0 kubenswrapper[4790]: I1011 10:53:44.226265 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-1" Oct 11 10:53:44.237674 master-0 kubenswrapper[4790]: I1011 10:53:44.237404 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrlnr\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-kube-api-access-lrlnr\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.237674 master-0 kubenswrapper[4790]: I1011 10:53:44.237538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-lock\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.237674 master-0 kubenswrapper[4790]: I1011 10:53:44.237676 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") 
pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.263830 master-0 kubenswrapper[4790]: I1011 10:53:44.263315 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-cache\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.264093 master-0 kubenswrapper[4790]: I1011 10:53:44.263676 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fec5bf36-c7f7-4a22-93a7-214a8670a9dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6b422b68-2a82-4e2e-ab23-eb4dc08f75f3\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.312791 master-0 kubenswrapper[4790]: I1011 10:53:44.312721 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-1" Oct 11 10:53:44.369204 master-0 kubenswrapper[4790]: I1011 10:53:44.368792 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.369204 master-0 kubenswrapper[4790]: I1011 10:53:44.368966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-cache\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.369204 master-0 kubenswrapper[4790]: I1011 10:53:44.368995 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fec5bf36-c7f7-4a22-93a7-214a8670a9dd\" 
(UniqueName: \"kubernetes.io/csi/topolvm.io^6b422b68-2a82-4e2e-ab23-eb4dc08f75f3\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.369204 master-0 kubenswrapper[4790]: I1011 10:53:44.369023 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrlnr\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-kube-api-access-lrlnr\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.369204 master-0 kubenswrapper[4790]: I1011 10:53:44.369038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-lock\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.369591 master-0 kubenswrapper[4790]: I1011 10:53:44.369562 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-lock\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.369917 master-0 kubenswrapper[4790]: E1011 10:53:44.369880 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 10:53:44.369971 master-0 kubenswrapper[4790]: E1011 10:53:44.369918 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 10:53:44.370008 master-0 kubenswrapper[4790]: E1011 10:53:44.369974 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift podName:c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27 nodeName:}" failed. 
No retries permitted until 2025-10-11 10:53:44.869948229 +0000 UTC m=+901.424408521 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift") pod "swift-storage-0" (UID: "c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27") : configmap "swift-ring-files" not found Oct 11 10:53:44.377242 master-0 kubenswrapper[4790]: I1011 10:53:44.377190 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-cache\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.382346 master-0 kubenswrapper[4790]: I1011 10:53:44.379039 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 11 10:53:44.382346 master-0 kubenswrapper[4790]: I1011 10:53:44.379109 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fec5bf36-c7f7-4a22-93a7-214a8670a9dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6b422b68-2a82-4e2e-ab23-eb4dc08f75f3\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/00bcb7e430e97a6ecb5450e981c0e90813b177e374e1633380e36c3e3673697d/globalmount\"" pod="openstack/swift-storage-0" Oct 11 10:53:44.423402 master-0 kubenswrapper[4790]: I1011 10:53:44.423330 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrlnr\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-kube-api-access-lrlnr\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.878275 master-0 kubenswrapper[4790]: I1011 10:53:44.878198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:53:44.878554 master-0 kubenswrapper[4790]: E1011 10:53:44.878413 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 10:53:44.878554 master-0 kubenswrapper[4790]: E1011 10:53:44.878440 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 10:53:44.878554 master-0 kubenswrapper[4790]: E1011 10:53:44.878513 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift podName:c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27 nodeName:}" failed. No retries permitted until 2025-10-11 10:53:45.878487804 +0000 UTC m=+902.432948096 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift") pod "swift-storage-0" (UID: "c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27") : configmap "swift-ring-files" not found
Oct 11 10:53:44.987848 master-0 kubenswrapper[4790]: I1011 10:53:44.987762 4790 generic.go:334] "Generic (PLEG): container finished" podID="be929908-6474-451d-8b87-e4effd7c6de4" containerID="ec0350a7355f6c5da814acb3e4b50a985a39478e4987d701f3bdf168b3a6530a" exitCode=0
Oct 11 10:53:44.988407 master-0 kubenswrapper[4790]: I1011 10:53:44.987888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"be929908-6474-451d-8b87-e4effd7c6de4","Type":"ContainerDied","Data":"ec0350a7355f6c5da814acb3e4b50a985a39478e4987d701f3bdf168b3a6530a"}
Oct 11 10:53:44.990074 master-0 kubenswrapper[4790]: I1011 10:53:44.990005 4790 generic.go:334] "Generic (PLEG): container finished" podID="c13cb0d1-c50f-44fa-824a-46ece423a7cc" containerID="c4be72b5de1183ec2ba8473fed3fe3c9aa390d6a7ab90458f3ecfc26bf72839f" exitCode=0
Oct 11 10:53:44.990137 master-0 kubenswrapper[4790]: I1011 10:53:44.990078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c13cb0d1-c50f-44fa-824a-46ece423a7cc","Type":"ContainerDied","Data":"c4be72b5de1183ec2ba8473fed3fe3c9aa390d6a7ab90458f3ecfc26bf72839f"}
Oct 11 10:53:44.992944 master-0 kubenswrapper[4790]: I1011 10:53:44.992874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" event={"ID":"6a4dc537-c4a3-4538-887f-62fe3919d5f0","Type":"ContainerStarted","Data":"381e97f0118ecd0a55897d40739e39a7fe13db1d358b9b0a4b193c8f784e5a52"}
Oct 11 10:53:44.993075 master-0 kubenswrapper[4790]: I1011 10:53:44.993007 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt"
Oct 11 10:53:45.099804 master-0 kubenswrapper[4790]: I1011 10:53:45.099665 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" podStartSLOduration=3.099640413 podStartE2EDuration="3.099640413s" podCreationTimestamp="2025-10-11 10:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:53:45.098379168 +0000 UTC m=+901.652839460" watchObservedRunningTime="2025-10-11 10:53:45.099640413 +0000 UTC m=+901.654100705"
Oct 11 10:53:45.160468 master-0 kubenswrapper[4790]: I1011 10:53:45.155168 4790 trace.go:236] Trace[1484285432]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-1" (11-Oct-2025 10:53:44.136) (total time: 1018ms):
Oct 11 10:53:45.160468 master-0 kubenswrapper[4790]: Trace[1484285432]: [1.018910799s] [1.018910799s] END
Oct 11 10:53:45.782912 master-0 kubenswrapper[4790]: I1011 10:53:45.782810 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-2" podUID="5059e0b0-120f-4498-8076-e3e9239b5688" containerName="galera" probeResult="failure" output=<
Oct 11 10:53:45.782912 master-0 kubenswrapper[4790]: wsrep_local_state_comment (Donor/Desynced) differs from Synced
Oct 11 10:53:45.782912 master-0 kubenswrapper[4790]: >
Oct 11 10:53:45.897408 master-0 kubenswrapper[4790]: I1011 10:53:45.897337 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0"
Oct 11 10:53:45.897672 master-0 kubenswrapper[4790]: E1011 10:53:45.897576 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 11 10:53:45.897672 master-0 kubenswrapper[4790]: E1011 10:53:45.897598 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 11 10:53:45.897672 master-0 kubenswrapper[4790]: E1011 10:53:45.897657 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift podName:c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27 nodeName:}" failed. No retries permitted until 2025-10-11 10:53:47.89763736 +0000 UTC m=+904.452097652 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift") pod "swift-storage-0" (UID: "c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27") : configmap "swift-ring-files" not found
Oct 11 10:53:46.016386 master-0 kubenswrapper[4790]: I1011 10:53:46.016319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"be929908-6474-451d-8b87-e4effd7c6de4","Type":"ContainerStarted","Data":"6eeb4898825ff63a64166d6c91cc65fe503c754c39bb17b8a91974c755363ef9"}
Oct 11 10:53:46.017482 master-0 kubenswrapper[4790]: I1011 10:53:46.017404 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-1"
Oct 11 10:53:46.021280 master-0 kubenswrapper[4790]: I1011 10:53:46.021239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c13cb0d1-c50f-44fa-824a-46ece423a7cc","Type":"ContainerStarted","Data":"8c4e3c9ea6d44bef7da8dfa27820fa539dc9c608507aa54b405dc8481ca9a2de"}
Oct 11 10:53:46.021520 master-0 kubenswrapper[4790]: I1011 10:53:46.021481 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1"
Oct 11 10:53:46.058124 master-0 kubenswrapper[4790]: I1011 10:53:46.057980 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-1" podStartSLOduration=55.057919572 podStartE2EDuration="55.057919572s" podCreationTimestamp="2025-10-11 10:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:53:46.051134583 +0000 UTC m=+902.605594885" watchObservedRunningTime="2025-10-11 10:53:46.057919572 +0000 UTC m=+902.612379874"
Oct 11 10:53:46.096876 master-0 kubenswrapper[4790]: I1011 10:53:46.096775 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=48.004478761 podStartE2EDuration="56.096745628s" podCreationTimestamp="2025-10-11 10:52:50 +0000 UTC" firstStartedPulling="2025-10-11 10:53:02.701315784 +0000 UTC m=+859.255776076" lastFinishedPulling="2025-10-11 10:53:10.793582661 +0000 UTC m=+867.348042943" observedRunningTime="2025-10-11 10:53:46.091697328 +0000 UTC m=+902.646157640" watchObservedRunningTime="2025-10-11 10:53:46.096745628 +0000 UTC m=+902.651206160"
Oct 11 10:53:46.496375 master-0 kubenswrapper[4790]: I1011 10:53:46.495561 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-52t2l" podUID="8c164a4b-a2d5-4570-aed3-86dbb1f3d47c" containerName="ovn-controller" probeResult="failure" output=<
Oct 11 10:53:46.496375 master-0 kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 11 10:53:46.496375 master-0 kubenswrapper[4790]: >
Oct 11 10:53:46.523664 master-0 kubenswrapper[4790]: I1011 10:53:46.523536 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fec5bf36-c7f7-4a22-93a7-214a8670a9dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6b422b68-2a82-4e2e-ab23-eb4dc08f75f3\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0"
Oct 11 10:53:46.599640 master-0 kubenswrapper[4790]: I1011 10:53:46.599181 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:46.623270 master-0 kubenswrapper[4790]: I1011 10:53:46.623200 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dw8wx"
Oct 11 10:53:47.935109 master-0 kubenswrapper[4790]: I1011 10:53:47.935020 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0"
Oct 11 10:53:47.939960 master-0 kubenswrapper[4790]: E1011 10:53:47.935326 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 11 10:53:47.939960 master-0 kubenswrapper[4790]: E1011 10:53:47.935379 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 11 10:53:47.939960 master-0 kubenswrapper[4790]: E1011 10:53:47.935458 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift podName:c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27 nodeName:}" failed. No retries permitted until 2025-10-11 10:53:51.935431017 +0000 UTC m=+908.489891299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift") pod "swift-storage-0" (UID: "c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27") : configmap "swift-ring-files" not found
Oct 11 10:53:48.968460 master-0 kubenswrapper[4790]: I1011 10:53:48.968382 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-113b-account-create-twjxb"]
Oct 11 10:53:48.970157 master-0 kubenswrapper[4790]: I1011 10:53:48.970119 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-113b-account-create-twjxb"
Oct 11 10:53:48.973585 master-0 kubenswrapper[4790]: I1011 10:53:48.973542 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Oct 11 10:53:48.979584 master-0 kubenswrapper[4790]: I1011 10:53:48.979521 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-113b-account-create-twjxb"]
Oct 11 10:53:49.050784 master-0 kubenswrapper[4790]: I1011 10:53:49.050687 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxcmd\" (UniqueName: \"kubernetes.io/projected/70cbbe93-7c50-40cb-91f4-f75c8875580d-kube-api-access-hxcmd\") pod \"glance-113b-account-create-twjxb\" (UID: \"70cbbe93-7c50-40cb-91f4-f75c8875580d\") " pod="openstack/glance-113b-account-create-twjxb"
Oct 11 10:53:49.153683 master-0 kubenswrapper[4790]: I1011 10:53:49.153584 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxcmd\" (UniqueName: \"kubernetes.io/projected/70cbbe93-7c50-40cb-91f4-f75c8875580d-kube-api-access-hxcmd\") pod \"glance-113b-account-create-twjxb\" (UID: \"70cbbe93-7c50-40cb-91f4-f75c8875580d\") " pod="openstack/glance-113b-account-create-twjxb"
Oct 11 10:53:49.177485 master-0 kubenswrapper[4790]: I1011 10:53:49.177415 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxcmd\" (UniqueName: \"kubernetes.io/projected/70cbbe93-7c50-40cb-91f4-f75c8875580d-kube-api-access-hxcmd\") pod \"glance-113b-account-create-twjxb\" (UID: \"70cbbe93-7c50-40cb-91f4-f75c8875580d\") " pod="openstack/glance-113b-account-create-twjxb"
Oct 11 10:53:49.322640 master-0 kubenswrapper[4790]: I1011 10:53:49.322561 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-113b-account-create-twjxb"
Oct 11 10:53:49.762853 master-0 kubenswrapper[4790]: I1011 10:53:49.761370 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-113b-account-create-twjxb"]
Oct 11 10:53:49.768729 master-0 kubenswrapper[4790]: W1011 10:53:49.767428 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70cbbe93_7c50_40cb_91f4_f75c8875580d.slice/crio-3c49cfcc6699f3b2309e444c646f6f84fab947b9d15c58ff2909f3cf16e3437e WatchSource:0}: Error finding container 3c49cfcc6699f3b2309e444c646f6f84fab947b9d15c58ff2909f3cf16e3437e: Status 404 returned error can't find the container with id 3c49cfcc6699f3b2309e444c646f6f84fab947b9d15c58ff2909f3cf16e3437e
Oct 11 10:53:50.057181 master-0 kubenswrapper[4790]: I1011 10:53:50.057050 4790 generic.go:334] "Generic (PLEG): container finished" podID="70cbbe93-7c50-40cb-91f4-f75c8875580d" containerID="1cde782f190214155e020d24bbe2e2d5c9f2dc24b3fea8e9236ee944da092a1c" exitCode=0
Oct 11 10:53:50.057181 master-0 kubenswrapper[4790]: I1011 10:53:50.057133 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-113b-account-create-twjxb" event={"ID":"70cbbe93-7c50-40cb-91f4-f75c8875580d","Type":"ContainerDied","Data":"1cde782f190214155e020d24bbe2e2d5c9f2dc24b3fea8e9236ee944da092a1c"}
Oct 11 10:53:50.057852 master-0 kubenswrapper[4790]: I1011 10:53:50.057296 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-113b-account-create-twjxb" event={"ID":"70cbbe93-7c50-40cb-91f4-f75c8875580d","Type":"ContainerStarted","Data":"3c49cfcc6699f3b2309e444c646f6f84fab947b9d15c58ff2909f3cf16e3437e"}
Oct 11 10:53:50.265638 master-0 kubenswrapper[4790]: I1011 10:53:50.265570 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-52t2l-config-8jkrq"]
Oct 11 10:53:50.266906 master-0 kubenswrapper[4790]: I1011 10:53:50.266876 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.270180 master-0 kubenswrapper[4790]: I1011 10:53:50.270046 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 11 10:53:50.283306 master-0 kubenswrapper[4790]: I1011 10:53:50.283236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.283520 master-0 kubenswrapper[4790]: I1011 10:53:50.283327 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-additional-scripts\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.283520 master-0 kubenswrapper[4790]: I1011 10:53:50.283403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run-ovn\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.283520 master-0 kubenswrapper[4790]: I1011 10:53:50.283487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-scripts\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.283520 master-0 kubenswrapper[4790]: I1011 10:53:50.283489 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-52t2l-config-8jkrq"]
Oct 11 10:53:50.283914 master-0 kubenswrapper[4790]: I1011 10:53:50.283849 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-log-ovn\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.283988 master-0 kubenswrapper[4790]: I1011 10:53:50.283953 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwqg\" (UniqueName: \"kubernetes.io/projected/c874e686-cbb2-4de3-8058-382a74b5742d-kube-api-access-6nwqg\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.386476 master-0 kubenswrapper[4790]: I1011 10:53:50.386362 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.386476 master-0 kubenswrapper[4790]: I1011 10:53:50.386491 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-additional-scripts\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.386833 master-0 kubenswrapper[4790]: I1011 10:53:50.386526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run-ovn\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.386833 master-0 kubenswrapper[4790]: I1011 10:53:50.386569 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-scripts\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.386833 master-0 kubenswrapper[4790]: I1011 10:53:50.386603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-log-ovn\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.386833 master-0 kubenswrapper[4790]: I1011 10:53:50.386638 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwqg\" (UniqueName: \"kubernetes.io/projected/c874e686-cbb2-4de3-8058-382a74b5742d-kube-api-access-6nwqg\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.386833 master-0 kubenswrapper[4790]: I1011 10:53:50.386652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.387094 master-0 kubenswrapper[4790]: I1011 10:53:50.387014 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run-ovn\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.387374 master-0 kubenswrapper[4790]: I1011 10:53:50.387335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-log-ovn\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.387896 master-0 kubenswrapper[4790]: I1011 10:53:50.387864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-additional-scripts\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.391856 master-0 kubenswrapper[4790]: I1011 10:53:50.391811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-scripts\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.421358 master-0 kubenswrapper[4790]: I1011 10:53:50.421309 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwqg\" (UniqueName: \"kubernetes.io/projected/c874e686-cbb2-4de3-8058-382a74b5742d-kube-api-access-6nwqg\") pod \"ovn-controller-52t2l-config-8jkrq\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") " pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:50.638126 master-0 kubenswrapper[4790]: I1011 10:53:50.638038 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:51.125668 master-0 kubenswrapper[4790]: W1011 10:53:51.125598 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc874e686_cbb2_4de3_8058_382a74b5742d.slice/crio-4d1096b4dae647b1b4423270ddfe174d7a5e98b7791d42acd49047577bd6f195 WatchSource:0}: Error finding container 4d1096b4dae647b1b4423270ddfe174d7a5e98b7791d42acd49047577bd6f195: Status 404 returned error can't find the container with id 4d1096b4dae647b1b4423270ddfe174d7a5e98b7791d42acd49047577bd6f195
Oct 11 10:53:51.126188 master-0 kubenswrapper[4790]: I1011 10:53:51.126087 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-52t2l-config-8jkrq"]
Oct 11 10:53:51.368520 master-0 kubenswrapper[4790]: I1011 10:53:51.368472 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-113b-account-create-twjxb"
Oct 11 10:53:51.437127 master-0 kubenswrapper[4790]: I1011 10:53:51.437057 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxcmd\" (UniqueName: \"kubernetes.io/projected/70cbbe93-7c50-40cb-91f4-f75c8875580d-kube-api-access-hxcmd\") pod \"70cbbe93-7c50-40cb-91f4-f75c8875580d\" (UID: \"70cbbe93-7c50-40cb-91f4-f75c8875580d\") "
Oct 11 10:53:51.442813 master-0 kubenswrapper[4790]: I1011 10:53:51.441296 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70cbbe93-7c50-40cb-91f4-f75c8875580d-kube-api-access-hxcmd" (OuterVolumeSpecName: "kube-api-access-hxcmd") pod "70cbbe93-7c50-40cb-91f4-f75c8875580d" (UID: "70cbbe93-7c50-40cb-91f4-f75c8875580d"). InnerVolumeSpecName "kube-api-access-hxcmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:53:51.497990 master-0 kubenswrapper[4790]: I1011 10:53:51.496950 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-52t2l"
Oct 11 10:53:51.541396 master-0 kubenswrapper[4790]: I1011 10:53:51.541338 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxcmd\" (UniqueName: \"kubernetes.io/projected/70cbbe93-7c50-40cb-91f4-f75c8875580d-kube-api-access-hxcmd\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:51.948520 master-0 kubenswrapper[4790]: I1011 10:53:51.948356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0"
Oct 11 10:53:51.948745 master-0 kubenswrapper[4790]: E1011 10:53:51.948600 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 11 10:53:51.948745 master-0 kubenswrapper[4790]: E1011 10:53:51.948627 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 11 10:53:51.948745 master-0 kubenswrapper[4790]: E1011 10:53:51.948725 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift podName:c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27 nodeName:}" failed. No retries permitted until 2025-10-11 10:53:59.948680924 +0000 UTC m=+916.503141216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift") pod "swift-storage-0" (UID: "c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27") : configmap "swift-ring-files" not found
Oct 11 10:53:52.073931 master-0 kubenswrapper[4790]: I1011 10:53:52.073868 4790 generic.go:334] "Generic (PLEG): container finished" podID="c874e686-cbb2-4de3-8058-382a74b5742d" containerID="cc0e410018cdbb38cb0a44455ce0c9bcffaa24fb5b85e7a4f71ece632724bed8" exitCode=0
Oct 11 10:53:52.074212 master-0 kubenswrapper[4790]: I1011 10:53:52.073987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-52t2l-config-8jkrq" event={"ID":"c874e686-cbb2-4de3-8058-382a74b5742d","Type":"ContainerDied","Data":"cc0e410018cdbb38cb0a44455ce0c9bcffaa24fb5b85e7a4f71ece632724bed8"}
Oct 11 10:53:52.074212 master-0 kubenswrapper[4790]: I1011 10:53:52.074095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-52t2l-config-8jkrq" event={"ID":"c874e686-cbb2-4de3-8058-382a74b5742d","Type":"ContainerStarted","Data":"4d1096b4dae647b1b4423270ddfe174d7a5e98b7791d42acd49047577bd6f195"}
Oct 11 10:53:52.076228 master-0 kubenswrapper[4790]: I1011 10:53:52.076187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-113b-account-create-twjxb" event={"ID":"70cbbe93-7c50-40cb-91f4-f75c8875580d","Type":"ContainerDied","Data":"3c49cfcc6699f3b2309e444c646f6f84fab947b9d15c58ff2909f3cf16e3437e"}
Oct 11 10:53:52.076228 master-0 kubenswrapper[4790]: I1011 10:53:52.076216 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c49cfcc6699f3b2309e444c646f6f84fab947b9d15c58ff2909f3cf16e3437e"
Oct 11 10:53:52.076366 master-0 kubenswrapper[4790]: I1011 10:53:52.076327 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-113b-account-create-twjxb"
Oct 11 10:53:52.474040 master-0 kubenswrapper[4790]: I1011 10:53:52.473918 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt"
Oct 11 10:53:53.502700 master-0 kubenswrapper[4790]: I1011 10:53:53.502633 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:53.580188 master-0 kubenswrapper[4790]: I1011 10:53:53.580090 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nwqg\" (UniqueName: \"kubernetes.io/projected/c874e686-cbb2-4de3-8058-382a74b5742d-kube-api-access-6nwqg\") pod \"c874e686-cbb2-4de3-8058-382a74b5742d\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") "
Oct 11 10:53:53.580506 master-0 kubenswrapper[4790]: I1011 10:53:53.580228 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-additional-scripts\") pod \"c874e686-cbb2-4de3-8058-382a74b5742d\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") "
Oct 11 10:53:53.580506 master-0 kubenswrapper[4790]: I1011 10:53:53.580272 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run\") pod \"c874e686-cbb2-4de3-8058-382a74b5742d\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") "
Oct 11 10:53:53.580506 master-0 kubenswrapper[4790]: I1011 10:53:53.580299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-log-ovn\") pod \"c874e686-cbb2-4de3-8058-382a74b5742d\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") "
Oct 11 10:53:53.580506 master-0 kubenswrapper[4790]: I1011 10:53:53.580450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-scripts\") pod \"c874e686-cbb2-4de3-8058-382a74b5742d\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") "
Oct 11 10:53:53.580506 master-0 kubenswrapper[4790]: I1011 10:53:53.580480 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run-ovn\") pod \"c874e686-cbb2-4de3-8058-382a74b5742d\" (UID: \"c874e686-cbb2-4de3-8058-382a74b5742d\") "
Oct 11 10:53:53.580920 master-0 kubenswrapper[4790]: I1011 10:53:53.580857 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run" (OuterVolumeSpecName: "var-run") pod "c874e686-cbb2-4de3-8058-382a74b5742d" (UID: "c874e686-cbb2-4de3-8058-382a74b5742d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:53:53.580986 master-0 kubenswrapper[4790]: I1011 10:53:53.580959 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c874e686-cbb2-4de3-8058-382a74b5742d" (UID: "c874e686-cbb2-4de3-8058-382a74b5742d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:53:53.581022 master-0 kubenswrapper[4790]: I1011 10:53:53.581007 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c874e686-cbb2-4de3-8058-382a74b5742d" (UID: "c874e686-cbb2-4de3-8058-382a74b5742d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 10:53:53.581695 master-0 kubenswrapper[4790]: I1011 10:53:53.581662 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c874e686-cbb2-4de3-8058-382a74b5742d" (UID: "c874e686-cbb2-4de3-8058-382a74b5742d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 11 10:53:53.581988 master-0 kubenswrapper[4790]: I1011 10:53:53.581947 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-scripts" (OuterVolumeSpecName: "scripts") pod "c874e686-cbb2-4de3-8058-382a74b5742d" (UID: "c874e686-cbb2-4de3-8058-382a74b5742d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 11 10:53:53.584841 master-0 kubenswrapper[4790]: I1011 10:53:53.584796 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c874e686-cbb2-4de3-8058-382a74b5742d-kube-api-access-6nwqg" (OuterVolumeSpecName: "kube-api-access-6nwqg") pod "c874e686-cbb2-4de3-8058-382a74b5742d" (UID: "c874e686-cbb2-4de3-8058-382a74b5742d"). InnerVolumeSpecName "kube-api-access-6nwqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:53:53.682304 master-0 kubenswrapper[4790]: I1011 10:53:53.682149 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-scripts\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:53.682304 master-0 kubenswrapper[4790]: I1011 10:53:53.682190 4790 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run-ovn\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:53.682304 master-0 kubenswrapper[4790]: I1011 10:53:53.682201 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nwqg\" (UniqueName: \"kubernetes.io/projected/c874e686-cbb2-4de3-8058-382a74b5742d-kube-api-access-6nwqg\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:53.682304 master-0 kubenswrapper[4790]: I1011 10:53:53.682211 4790 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c874e686-cbb2-4de3-8058-382a74b5742d-additional-scripts\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:53.682304 master-0 kubenswrapper[4790]: I1011 10:53:53.682223 4790 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-run\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:53.682304 master-0 kubenswrapper[4790]: I1011 10:53:53.682233 4790 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c874e686-cbb2-4de3-8058-382a74b5742d-var-log-ovn\") on node \"master-0\" DevicePath \"\""
Oct 11 10:53:54.100758 master-0 kubenswrapper[4790]: I1011 10:53:54.100505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-52t2l-config-8jkrq" event={"ID":"c874e686-cbb2-4de3-8058-382a74b5742d","Type":"ContainerDied","Data":"4d1096b4dae647b1b4423270ddfe174d7a5e98b7791d42acd49047577bd6f195"}
Oct 11 10:53:54.100758 master-0 kubenswrapper[4790]: I1011 10:53:54.100622 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1096b4dae647b1b4423270ddfe174d7a5e98b7791d42acd49047577bd6f195"
Oct 11 10:53:54.100758 master-0 kubenswrapper[4790]: I1011 10:53:54.100657 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-52t2l-config-8jkrq"
Oct 11 10:53:54.622736 master-0 kubenswrapper[4790]: I1011 10:53:54.621820 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-52t2l-config-8jkrq"]
Oct 11 10:53:54.627211 master-0 kubenswrapper[4790]: I1011 10:53:54.627145 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-52t2l-config-8jkrq"]
Oct 11 10:53:55.193497 master-0 kubenswrapper[4790]: I1011 10:53:55.193395 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0a7a-account-create-9c44k"]
Oct 11 10:53:55.193904 master-0 kubenswrapper[4790]: E1011 10:53:55.193876 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874e686-cbb2-4de3-8058-382a74b5742d" containerName="ovn-config"
Oct 11 10:53:55.193962 master-0 kubenswrapper[4790]: I1011 10:53:55.193911 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874e686-cbb2-4de3-8058-382a74b5742d" containerName="ovn-config"
Oct 11 10:53:55.193962 master-0 kubenswrapper[4790]: E1011 10:53:55.193940 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70cbbe93-7c50-40cb-91f4-f75c8875580d" containerName="mariadb-account-create"
Oct 11 10:53:55.193962 master-0 kubenswrapper[4790]: I1011 10:53:55.193950 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="70cbbe93-7c50-40cb-91f4-f75c8875580d" containerName="mariadb-account-create"
Oct 11 10:53:55.194154 master-0 kubenswrapper[4790]: I1011 10:53:55.194133 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="70cbbe93-7c50-40cb-91f4-f75c8875580d" containerName="mariadb-account-create"
Oct 11 10:53:55.194197 master-0 kubenswrapper[4790]: I1011 10:53:55.194161 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874e686-cbb2-4de3-8058-382a74b5742d" containerName="ovn-config"
Oct 11 10:53:55.196148 master-0 kubenswrapper[4790]: I1011 10:53:55.196109 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a7a-account-create-9c44k"
Oct 11 10:53:55.200535 master-0 kubenswrapper[4790]: I1011 10:53:55.200278 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Oct 11 10:53:55.215154 master-0 kubenswrapper[4790]: I1011 10:53:55.215033 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a7a-account-create-9c44k"]
Oct 11 10:53:55.316667 master-0 kubenswrapper[4790]: I1011 10:53:55.316504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmddg\" (UniqueName: \"kubernetes.io/projected/c21954bc-9fb3-4d4e-8085-b2fcf628e0a5-kube-api-access-dmddg\") pod \"keystone-0a7a-account-create-9c44k\" (UID: \"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5\") " pod="openstack/keystone-0a7a-account-create-9c44k"
Oct 11 10:53:55.418215 master-0 kubenswrapper[4790]: I1011 10:53:55.418103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmddg\" (UniqueName: \"kubernetes.io/projected/c21954bc-9fb3-4d4e-8085-b2fcf628e0a5-kube-api-access-dmddg\") pod \"keystone-0a7a-account-create-9c44k\" (UID: \"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5\") " pod="openstack/keystone-0a7a-account-create-9c44k"
Oct 11 10:53:55.446297 master-0 kubenswrapper[4790]: I1011 10:53:55.446124 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmddg\" (UniqueName: \"kubernetes.io/projected/c21954bc-9fb3-4d4e-8085-b2fcf628e0a5-kube-api-access-dmddg\") pod \"keystone-0a7a-account-create-9c44k\" (UID: \"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5\") " pod="openstack/keystone-0a7a-account-create-9c44k"
Oct 11 10:53:55.512385 master-0 kubenswrapper[4790]: I1011 10:53:55.512294 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-1" podUID="be929908-6474-451d-8b87-e4effd7c6de4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.130.0.67:5671: connect: connection refused"
Oct 11 10:53:55.515778 master-0 kubenswrapper[4790]: I1011 10:53:55.515721 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a7a-account-create-9c44k"
Oct 11 10:53:55.643815 master-0 kubenswrapper[4790]: I1011 10:53:55.640607 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-af51-account-create-tz8f4"]
Oct 11 10:53:55.643815 master-0 kubenswrapper[4790]: I1011 10:53:55.642542 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-af51-account-create-tz8f4"]
Oct 11 10:53:55.643815 master-0 kubenswrapper[4790]: I1011 10:53:55.642783 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-af51-account-create-tz8f4" Oct 11 10:53:55.655746 master-0 kubenswrapper[4790]: I1011 10:53:55.655217 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 11 10:53:55.724008 master-0 kubenswrapper[4790]: I1011 10:53:55.723855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldn48\" (UniqueName: \"kubernetes.io/projected/4838cae2-31c3-4b4d-a914-e95b0b6308be-kube-api-access-ldn48\") pod \"placement-af51-account-create-tz8f4\" (UID: \"4838cae2-31c3-4b4d-a914-e95b0b6308be\") " pod="openstack/placement-af51-account-create-tz8f4" Oct 11 10:53:55.826430 master-0 kubenswrapper[4790]: I1011 10:53:55.826326 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldn48\" (UniqueName: \"kubernetes.io/projected/4838cae2-31c3-4b4d-a914-e95b0b6308be-kube-api-access-ldn48\") pod \"placement-af51-account-create-tz8f4\" (UID: \"4838cae2-31c3-4b4d-a914-e95b0b6308be\") " pod="openstack/placement-af51-account-create-tz8f4" Oct 11 10:53:55.846437 master-0 kubenswrapper[4790]: I1011 10:53:55.846366 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldn48\" (UniqueName: \"kubernetes.io/projected/4838cae2-31c3-4b4d-a914-e95b0b6308be-kube-api-access-ldn48\") pod \"placement-af51-account-create-tz8f4\" (UID: \"4838cae2-31c3-4b4d-a914-e95b0b6308be\") " pod="openstack/placement-af51-account-create-tz8f4" Oct 11 10:53:55.956891 master-0 kubenswrapper[4790]: I1011 10:53:55.956802 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a7a-account-create-9c44k"] Oct 11 10:53:55.959070 master-0 kubenswrapper[4790]: W1011 10:53:55.959002 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc21954bc_9fb3_4d4e_8085_b2fcf628e0a5.slice/crio-e7aabdd1f9e81707ceac24468611fbf243bb7fbd9a16bb4d480bb8af844a4e45 WatchSource:0}: Error finding container e7aabdd1f9e81707ceac24468611fbf243bb7fbd9a16bb4d480bb8af844a4e45: Status 404 returned error can't find the container with id e7aabdd1f9e81707ceac24468611fbf243bb7fbd9a16bb4d480bb8af844a4e45 Oct 11 10:53:55.975414 master-0 kubenswrapper[4790]: I1011 10:53:55.975247 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-af51-account-create-tz8f4" Oct 11 10:53:56.128670 master-0 kubenswrapper[4790]: I1011 10:53:56.128577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a7a-account-create-9c44k" event={"ID":"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5","Type":"ContainerStarted","Data":"3b1f890304ef9d089479614d38d81aabcf093a42157b52f334fa1a99c8f86aa6"} Oct 11 10:53:56.129068 master-0 kubenswrapper[4790]: I1011 10:53:56.129045 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a7a-account-create-9c44k" event={"ID":"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5","Type":"ContainerStarted","Data":"e7aabdd1f9e81707ceac24468611fbf243bb7fbd9a16bb4d480bb8af844a4e45"} Oct 11 10:53:56.156011 master-0 kubenswrapper[4790]: I1011 10:53:56.155924 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0a7a-account-create-9c44k" podStartSLOduration=1.1559017169999999 podStartE2EDuration="1.155901717s" podCreationTimestamp="2025-10-11 10:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:53:56.155470614 +0000 UTC m=+912.709930916" watchObservedRunningTime="2025-10-11 10:53:56.155901717 +0000 UTC m=+912.710362009" Oct 11 10:53:56.316393 master-0 kubenswrapper[4790]: I1011 10:53:56.316270 4790 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="c874e686-cbb2-4de3-8058-382a74b5742d" path="/var/lib/kubelet/pods/c874e686-cbb2-4de3-8058-382a74b5742d/volumes" Oct 11 10:53:56.481783 master-0 kubenswrapper[4790]: I1011 10:53:56.481681 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-af51-account-create-tz8f4"] Oct 11 10:53:56.484357 master-0 kubenswrapper[4790]: W1011 10:53:56.484285 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4838cae2_31c3_4b4d_a914_e95b0b6308be.slice/crio-0bb2113a61f088d79d33b08877923a158ef88b748a4b601ff9e7693ef5d257f9 WatchSource:0}: Error finding container 0bb2113a61f088d79d33b08877923a158ef88b748a4b601ff9e7693ef5d257f9: Status 404 returned error can't find the container with id 0bb2113a61f088d79d33b08877923a158ef88b748a4b601ff9e7693ef5d257f9 Oct 11 10:53:57.138853 master-0 kubenswrapper[4790]: I1011 10:53:57.138761 4790 generic.go:334] "Generic (PLEG): container finished" podID="c21954bc-9fb3-4d4e-8085-b2fcf628e0a5" containerID="3b1f890304ef9d089479614d38d81aabcf093a42157b52f334fa1a99c8f86aa6" exitCode=0 Oct 11 10:53:57.139685 master-0 kubenswrapper[4790]: I1011 10:53:57.138889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a7a-account-create-9c44k" event={"ID":"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5","Type":"ContainerDied","Data":"3b1f890304ef9d089479614d38d81aabcf093a42157b52f334fa1a99c8f86aa6"} Oct 11 10:53:57.141855 master-0 kubenswrapper[4790]: I1011 10:53:57.141779 4790 generic.go:334] "Generic (PLEG): container finished" podID="4838cae2-31c3-4b4d-a914-e95b0b6308be" containerID="677c51ee7ba248bdecdce7b7bb9d050175056a091f08201d76d54e3406eb2697" exitCode=0 Oct 11 10:53:57.141855 master-0 kubenswrapper[4790]: I1011 10:53:57.141832 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-af51-account-create-tz8f4" 
event={"ID":"4838cae2-31c3-4b4d-a914-e95b0b6308be","Type":"ContainerDied","Data":"677c51ee7ba248bdecdce7b7bb9d050175056a091f08201d76d54e3406eb2697"} Oct 11 10:53:57.142163 master-0 kubenswrapper[4790]: I1011 10:53:57.141888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-af51-account-create-tz8f4" event={"ID":"4838cae2-31c3-4b4d-a914-e95b0b6308be","Type":"ContainerStarted","Data":"0bb2113a61f088d79d33b08877923a158ef88b748a4b601ff9e7693ef5d257f9"} Oct 11 10:53:58.599451 master-0 kubenswrapper[4790]: I1011 10:53:58.599388 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a7a-account-create-9c44k" Oct 11 10:53:58.663817 master-0 kubenswrapper[4790]: I1011 10:53:58.661247 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-af51-account-create-tz8f4" Oct 11 10:53:58.691700 master-0 kubenswrapper[4790]: I1011 10:53:58.691613 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmddg\" (UniqueName: \"kubernetes.io/projected/c21954bc-9fb3-4d4e-8085-b2fcf628e0a5-kube-api-access-dmddg\") pod \"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5\" (UID: \"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5\") " Oct 11 10:53:58.695402 master-0 kubenswrapper[4790]: I1011 10:53:58.695353 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21954bc-9fb3-4d4e-8085-b2fcf628e0a5-kube-api-access-dmddg" (OuterVolumeSpecName: "kube-api-access-dmddg") pod "c21954bc-9fb3-4d4e-8085-b2fcf628e0a5" (UID: "c21954bc-9fb3-4d4e-8085-b2fcf628e0a5"). InnerVolumeSpecName "kube-api-access-dmddg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:53:58.793569 master-0 kubenswrapper[4790]: I1011 10:53:58.793409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldn48\" (UniqueName: \"kubernetes.io/projected/4838cae2-31c3-4b4d-a914-e95b0b6308be-kube-api-access-ldn48\") pod \"4838cae2-31c3-4b4d-a914-e95b0b6308be\" (UID: \"4838cae2-31c3-4b4d-a914-e95b0b6308be\") " Oct 11 10:53:58.794078 master-0 kubenswrapper[4790]: I1011 10:53:58.794039 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmddg\" (UniqueName: \"kubernetes.io/projected/c21954bc-9fb3-4d4e-8085-b2fcf628e0a5-kube-api-access-dmddg\") on node \"master-0\" DevicePath \"\"" Oct 11 10:53:58.796856 master-0 kubenswrapper[4790]: I1011 10:53:58.796788 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4838cae2-31c3-4b4d-a914-e95b0b6308be-kube-api-access-ldn48" (OuterVolumeSpecName: "kube-api-access-ldn48") pod "4838cae2-31c3-4b4d-a914-e95b0b6308be" (UID: "4838cae2-31c3-4b4d-a914-e95b0b6308be"). InnerVolumeSpecName "kube-api-access-ldn48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:53:58.897053 master-0 kubenswrapper[4790]: I1011 10:53:58.896983 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldn48\" (UniqueName: \"kubernetes.io/projected/4838cae2-31c3-4b4d-a914-e95b0b6308be-kube-api-access-ldn48\") on node \"master-0\" DevicePath \"\"" Oct 11 10:53:59.158417 master-0 kubenswrapper[4790]: I1011 10:53:59.158177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a7a-account-create-9c44k" event={"ID":"c21954bc-9fb3-4d4e-8085-b2fcf628e0a5","Type":"ContainerDied","Data":"e7aabdd1f9e81707ceac24468611fbf243bb7fbd9a16bb4d480bb8af844a4e45"} Oct 11 10:53:59.158417 master-0 kubenswrapper[4790]: I1011 10:53:59.158241 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a7a-account-create-9c44k" Oct 11 10:53:59.158417 master-0 kubenswrapper[4790]: I1011 10:53:59.158251 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7aabdd1f9e81707ceac24468611fbf243bb7fbd9a16bb4d480bb8af844a4e45" Oct 11 10:53:59.159822 master-0 kubenswrapper[4790]: I1011 10:53:59.159801 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-af51-account-create-tz8f4" event={"ID":"4838cae2-31c3-4b4d-a914-e95b0b6308be","Type":"ContainerDied","Data":"0bb2113a61f088d79d33b08877923a158ef88b748a4b601ff9e7693ef5d257f9"} Oct 11 10:53:59.159941 master-0 kubenswrapper[4790]: I1011 10:53:59.159824 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb2113a61f088d79d33b08877923a158ef88b748a4b601ff9e7693ef5d257f9" Oct 11 10:53:59.159941 master-0 kubenswrapper[4790]: I1011 10:53:59.159900 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-af51-account-create-tz8f4" Oct 11 10:53:59.709194 master-0 kubenswrapper[4790]: I1011 10:53:59.709014 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Oct 11 10:54:00.031928 master-0 kubenswrapper[4790]: I1011 10:54:00.031849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 10:54:00.037758 master-0 kubenswrapper[4790]: I1011 10:54:00.037227 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27-etc-swift\") pod \"swift-storage-0\" (UID: \"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27\") " pod="openstack/swift-storage-0" Oct 11 
10:54:00.169200 master-0 kubenswrapper[4790]: I1011 10:54:00.169105 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 11 10:54:00.708650 master-0 kubenswrapper[4790]: I1011 10:54:00.708578 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 11 10:54:00.710743 master-0 kubenswrapper[4790]: W1011 10:54:00.710620 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9fc8ec6_f4bb_4b20_a262_f416bb5d2e27.slice/crio-e82dbb6268c1705ebb0fa6f09840b38a2e47b597bdabb61f3383dd3a977d3c51 WatchSource:0}: Error finding container e82dbb6268c1705ebb0fa6f09840b38a2e47b597bdabb61f3383dd3a977d3c51: Status 404 returned error can't find the container with id e82dbb6268c1705ebb0fa6f09840b38a2e47b597bdabb61f3383dd3a977d3c51 Oct 11 10:54:01.181341 master-0 kubenswrapper[4790]: I1011 10:54:01.181259 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"e82dbb6268c1705ebb0fa6f09840b38a2e47b597bdabb61f3383dd3a977d3c51"} Oct 11 10:54:03.201315 master-0 kubenswrapper[4790]: I1011 10:54:03.201243 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"6d0b7d52579ae74fda5ef88219a149ef056d5d599e0ed232bb25bf15c7464b8c"} Oct 11 10:54:04.219528 master-0 kubenswrapper[4790]: I1011 10:54:04.219335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"37820c5383e5d4e38231b402469255b405cc0fdd5561fb4624d2034da9cbb9d0"} Oct 11 10:54:04.219528 master-0 kubenswrapper[4790]: I1011 10:54:04.219407 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"6590447afdfc62a9ba2ca9fa5c523b088ed4df3cc12cef99a7ede955e8ce36c3"} Oct 11 10:54:04.219528 master-0 kubenswrapper[4790]: I1011 10:54:04.219421 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"0695998edd7fc240eb37cac4d5324b8c178273057c6c8002869e01e96d4b9981"} Oct 11 10:54:05.231306 master-0 kubenswrapper[4790]: I1011 10:54:05.231250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"867964b4fb417e2b33378e4504b98f42f2d04e7627aa3f2866d64cb2ad2c84f3"} Oct 11 10:54:05.514005 master-0 kubenswrapper[4790]: I1011 10:54:05.513946 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-1" Oct 11 10:54:06.250670 master-0 kubenswrapper[4790]: I1011 10:54:06.250504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"5313bf3db3332a64e56381450b110677c0c55749837d6ead93e1209b714da374"} Oct 11 10:54:06.250670 master-0 kubenswrapper[4790]: I1011 10:54:06.250573 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"00a1fe54fd54b62b900d96589f4dbc1f0c193ade025c3d940ed3f5b16684e35e"} Oct 11 10:54:06.250670 master-0 kubenswrapper[4790]: I1011 10:54:06.250588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"83ad5d1bee1a61b4fdd539a237e5a303a9d8452b0460832b0320f9e19bb21759"} Oct 11 10:54:07.290583 master-0 kubenswrapper[4790]: I1011 10:54:07.290519 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"bafaa58ca2ae0811bca6369faf1f3eeffaa7f6659d4cd55d3a5f75b343aa6af7"} Oct 11 10:54:08.307350 master-0 kubenswrapper[4790]: I1011 10:54:08.307268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"b10f815cdec53ca320a8e1670becf4898c676fe9c38c1a504586e32f90b9087b"} Oct 11 10:54:08.307350 master-0 kubenswrapper[4790]: I1011 10:54:08.307325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"e584c55d849dfecfe9f65ef5697b782a271d717a25007b147cd6f193a202d356"} Oct 11 10:54:08.307350 master-0 kubenswrapper[4790]: I1011 10:54:08.307340 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"d39148494859b731e1f949702c63e3c8957a139d80a7700e1139a4b645942301"} Oct 11 10:54:08.749249 master-0 kubenswrapper[4790]: I1011 10:54:08.749184 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-7vsxp"] Oct 11 10:54:08.753224 master-0 kubenswrapper[4790]: E1011 10:54:08.753140 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4838cae2-31c3-4b4d-a914-e95b0b6308be" containerName="mariadb-account-create" Oct 11 10:54:08.753224 master-0 kubenswrapper[4790]: I1011 10:54:08.753205 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4838cae2-31c3-4b4d-a914-e95b0b6308be" containerName="mariadb-account-create" Oct 11 10:54:08.753478 master-0 kubenswrapper[4790]: E1011 10:54:08.753243 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21954bc-9fb3-4d4e-8085-b2fcf628e0a5" containerName="mariadb-account-create" 
Oct 11 10:54:08.753478 master-0 kubenswrapper[4790]: I1011 10:54:08.753255 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21954bc-9fb3-4d4e-8085-b2fcf628e0a5" containerName="mariadb-account-create"
Oct 11 10:54:08.753570 master-0 kubenswrapper[4790]: I1011 10:54:08.753480 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21954bc-9fb3-4d4e-8085-b2fcf628e0a5" containerName="mariadb-account-create"
Oct 11 10:54:08.753570 master-0 kubenswrapper[4790]: I1011 10:54:08.753504 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4838cae2-31c3-4b4d-a914-e95b0b6308be" containerName="mariadb-account-create"
Oct 11 10:54:08.758015 master-0 kubenswrapper[4790]: I1011 10:54:08.755278 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-7vsxp"
Oct 11 10:54:08.790164 master-0 kubenswrapper[4790]: I1011 10:54:08.789634 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-7vsxp"]
Oct 11 10:54:08.869476 master-0 kubenswrapper[4790]: I1011 10:54:08.869343 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tfdt\" (UniqueName: \"kubernetes.io/projected/d58f3b14-e8da-4046-afb1-c376a65ef16e-kube-api-access-7tfdt\") pod \"heat-db-create-7vsxp\" (UID: \"d58f3b14-e8da-4046-afb1-c376a65ef16e\") " pod="openstack/heat-db-create-7vsxp"
Oct 11 10:54:08.906914 master-0 kubenswrapper[4790]: I1011 10:54:08.906823 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-l7rcp"]
Oct 11 10:54:08.908233 master-0 kubenswrapper[4790]: I1011 10:54:08.908202 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l7rcp"
Oct 11 10:54:08.931429 master-0 kubenswrapper[4790]: I1011 10:54:08.931292 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l7rcp"]
Oct 11 10:54:08.971668 master-0 kubenswrapper[4790]: I1011 10:54:08.971581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfdt\" (UniqueName: \"kubernetes.io/projected/d58f3b14-e8da-4046-afb1-c376a65ef16e-kube-api-access-7tfdt\") pod \"heat-db-create-7vsxp\" (UID: \"d58f3b14-e8da-4046-afb1-c376a65ef16e\") " pod="openstack/heat-db-create-7vsxp"
Oct 11 10:54:08.994375 master-0 kubenswrapper[4790]: I1011 10:54:08.994298 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tfdt\" (UniqueName: \"kubernetes.io/projected/d58f3b14-e8da-4046-afb1-c376a65ef16e-kube-api-access-7tfdt\") pod \"heat-db-create-7vsxp\" (UID: \"d58f3b14-e8da-4046-afb1-c376a65ef16e\") " pod="openstack/heat-db-create-7vsxp"
Oct 11 10:54:09.040952 master-0 kubenswrapper[4790]: I1011 10:54:09.040657 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jmmst"]
Oct 11 10:54:09.041867 master-0 kubenswrapper[4790]: I1011 10:54:09.041825 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jmmst"
Oct 11 10:54:09.062534 master-0 kubenswrapper[4790]: I1011 10:54:09.062460 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jmmst"]
Oct 11 10:54:09.081732 master-0 kubenswrapper[4790]: I1011 10:54:09.080871 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-7vsxp"
Oct 11 10:54:09.087732 master-0 kubenswrapper[4790]: I1011 10:54:09.083186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mx4m\" (UniqueName: \"kubernetes.io/projected/dfe51cce-787f-4883-8b8f-f1ed50caa3d3-kube-api-access-7mx4m\") pod \"cinder-db-create-l7rcp\" (UID: \"dfe51cce-787f-4883-8b8f-f1ed50caa3d3\") " pod="openstack/cinder-db-create-l7rcp"
Oct 11 10:54:09.185858 master-0 kubenswrapper[4790]: I1011 10:54:09.185778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp54h\" (UniqueName: \"kubernetes.io/projected/d40b588a-5009-41c8-b8b0-b417de6693ac-kube-api-access-wp54h\") pod \"neutron-db-create-jmmst\" (UID: \"d40b588a-5009-41c8-b8b0-b417de6693ac\") " pod="openstack/neutron-db-create-jmmst"
Oct 11 10:54:09.186125 master-0 kubenswrapper[4790]: I1011 10:54:09.185905 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mx4m\" (UniqueName: \"kubernetes.io/projected/dfe51cce-787f-4883-8b8f-f1ed50caa3d3-kube-api-access-7mx4m\") pod \"cinder-db-create-l7rcp\" (UID: \"dfe51cce-787f-4883-8b8f-f1ed50caa3d3\") " pod="openstack/cinder-db-create-l7rcp"
Oct 11 10:54:09.213988 master-0 kubenswrapper[4790]: I1011 10:54:09.213747 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mx4m\" (UniqueName: \"kubernetes.io/projected/dfe51cce-787f-4883-8b8f-f1ed50caa3d3-kube-api-access-7mx4m\") pod \"cinder-db-create-l7rcp\" (UID: \"dfe51cce-787f-4883-8b8f-f1ed50caa3d3\") " pod="openstack/cinder-db-create-l7rcp"
Oct 11 10:54:09.235262 master-0 kubenswrapper[4790]: I1011 10:54:09.235185 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l7rcp"
Oct 11 10:54:09.292472 master-0 kubenswrapper[4790]: I1011 10:54:09.292284 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp54h\" (UniqueName: \"kubernetes.io/projected/d40b588a-5009-41c8-b8b0-b417de6693ac-kube-api-access-wp54h\") pod \"neutron-db-create-jmmst\" (UID: \"d40b588a-5009-41c8-b8b0-b417de6693ac\") " pod="openstack/neutron-db-create-jmmst"
Oct 11 10:54:09.316364 master-0 kubenswrapper[4790]: I1011 10:54:09.316280 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp54h\" (UniqueName: \"kubernetes.io/projected/d40b588a-5009-41c8-b8b0-b417de6693ac-kube-api-access-wp54h\") pod \"neutron-db-create-jmmst\" (UID: \"d40b588a-5009-41c8-b8b0-b417de6693ac\") " pod="openstack/neutron-db-create-jmmst"
Oct 11 10:54:09.364912 master-0 kubenswrapper[4790]: I1011 10:54:09.364630 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jmmst"
Oct 11 10:54:10.371337 master-0 kubenswrapper[4790]: I1011 10:54:10.371252 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"88f16cf90fed3886e443e53b5fb532d3ed1d15ab715a3fc898671a81b5f8041f"}
Oct 11 10:54:10.536869 master-0 kubenswrapper[4790]: I1011 10:54:10.536814 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jmmst"]
Oct 11 10:54:10.542139 master-0 kubenswrapper[4790]: W1011 10:54:10.541429 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd40b588a_5009_41c8_b8b0_b417de6693ac.slice/crio-21ccb6e31874630690c4dcc47df17c20cab9612171ef0f3a7877de641d394997 WatchSource:0}: Error finding container 21ccb6e31874630690c4dcc47df17c20cab9612171ef0f3a7877de641d394997: Status 404 returned error can't find the container with id 21ccb6e31874630690c4dcc47df17c20cab9612171ef0f3a7877de641d394997
Oct 11 10:54:10.631267 master-0 kubenswrapper[4790]: I1011 10:54:10.631174 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l7rcp"]
Oct 11 10:54:10.645538 master-0 kubenswrapper[4790]: W1011 10:54:10.645260 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfe51cce_787f_4883_8b8f_f1ed50caa3d3.slice/crio-905e6ab440134109d05c0c22661864d57e4a601ca8e1bc57224de6d60efefdcf WatchSource:0}: Error finding container 905e6ab440134109d05c0c22661864d57e4a601ca8e1bc57224de6d60efefdcf: Status 404 returned error can't find the container with id 905e6ab440134109d05c0c22661864d57e4a601ca8e1bc57224de6d60efefdcf
Oct 11 10:54:10.713923 master-0 kubenswrapper[4790]: I1011 10:54:10.713784 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-7vsxp"]
Oct 11 10:54:10.748586 master-0 kubenswrapper[4790]: W1011 10:54:10.748521 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd58f3b14_e8da_4046_afb1_c376a65ef16e.slice/crio-bf7f6808c9467f778d59d0800baa52d59254b1f21ee42f31e32aab7d481b8f7a WatchSource:0}: Error finding container bf7f6808c9467f778d59d0800baa52d59254b1f21ee42f31e32aab7d481b8f7a: Status 404 returned error can't find the container with id bf7f6808c9467f778d59d0800baa52d59254b1f21ee42f31e32aab7d481b8f7a
Oct 11 10:54:11.381458 master-0 kubenswrapper[4790]: I1011 10:54:11.381368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-7vsxp" event={"ID":"d58f3b14-e8da-4046-afb1-c376a65ef16e","Type":"ContainerStarted","Data":"6c3235d5d6bf2dc75b5b651ccfac846d26bd4957ea4e687fad602fa916728d6b"}
Oct 11 10:54:11.381458 master-0 kubenswrapper[4790]: I1011 10:54:11.381430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-7vsxp" event={"ID":"d58f3b14-e8da-4046-afb1-c376a65ef16e","Type":"ContainerStarted","Data":"bf7f6808c9467f778d59d0800baa52d59254b1f21ee42f31e32aab7d481b8f7a"}
Oct 11 10:54:11.384550 master-0 kubenswrapper[4790]: I1011 10:54:11.384468 4790 generic.go:334] "Generic (PLEG): container finished" podID="d40b588a-5009-41c8-b8b0-b417de6693ac" containerID="07ec7b09db6fb5294fadc7bd8337f6b789e9b95a2303619665336b8735fa4bfe" exitCode=0
Oct 11 10:54:11.384550 master-0 kubenswrapper[4790]: I1011 10:54:11.384510 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jmmst" event={"ID":"d40b588a-5009-41c8-b8b0-b417de6693ac","Type":"ContainerDied","Data":"07ec7b09db6fb5294fadc7bd8337f6b789e9b95a2303619665336b8735fa4bfe"}
Oct 11 10:54:11.384550 master-0 kubenswrapper[4790]: I1011 10:54:11.384543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jmmst" event={"ID":"d40b588a-5009-41c8-b8b0-b417de6693ac","Type":"ContainerStarted","Data":"21ccb6e31874630690c4dcc47df17c20cab9612171ef0f3a7877de641d394997"}
Oct 11 10:54:11.387077 master-0 kubenswrapper[4790]: I1011 10:54:11.387019 4790 generic.go:334] "Generic (PLEG): container finished" podID="dfe51cce-787f-4883-8b8f-f1ed50caa3d3" containerID="4a77a0e25a1bbd76eb350e88d6052fb5f4963ac556fb275beeaf9d30c06320df" exitCode=0
Oct 11 10:54:11.387361 master-0 kubenswrapper[4790]: I1011 10:54:11.387104 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l7rcp" event={"ID":"dfe51cce-787f-4883-8b8f-f1ed50caa3d3","Type":"ContainerDied","Data":"4a77a0e25a1bbd76eb350e88d6052fb5f4963ac556fb275beeaf9d30c06320df"}
Oct 11 10:54:11.387361 master-0 kubenswrapper[4790]: I1011 10:54:11.387133 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l7rcp" event={"ID":"dfe51cce-787f-4883-8b8f-f1ed50caa3d3","Type":"ContainerStarted","Data":"905e6ab440134109d05c0c22661864d57e4a601ca8e1bc57224de6d60efefdcf"}
Oct 11 10:54:11.405810 master-0 kubenswrapper[4790]: I1011 10:54:11.398893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"ad43c9b41681c6e6739e54f5322e411f22ca601d6079dba8191993837f1dc376"}
Oct 11 10:54:11.405810 master-0 kubenswrapper[4790]: I1011 10:54:11.398978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c9fc8ec6-f4bb-4b20-a262-f416bb5d2e27","Type":"ContainerStarted","Data":"b29805ffd3e058a9848dcd6fafa8543718c5c95eef835aa813c84a10a47e1889"}
Oct 11 10:54:11.648552 master-0 kubenswrapper[4790]: I1011 10:54:11.648449 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-7vsxp" podStartSLOduration=3.648420251 podStartE2EDuration="3.648420251s" podCreationTimestamp="2025-10-11 10:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:54:11.622022649 +0000 UTC m=+928.176482941" watchObservedRunningTime="2025-10-11 10:54:11.648420251 +0000 UTC m=+928.202880543"
Oct 11 10:54:11.886108 master-0 kubenswrapper[4790]: I1011 10:54:11.886008 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.555880263 podStartE2EDuration="30.885982186s" podCreationTimestamp="2025-10-11 10:53:41 +0000 UTC" firstStartedPulling="2025-10-11 10:54:00.714254151 +0000 UTC m=+917.268714473" lastFinishedPulling="2025-10-11 10:54:10.044356104 +0000 UTC m=+926.598816396" observedRunningTime="2025-10-11 10:54:11.881590833 +0000 UTC m=+928.436051165" watchObservedRunningTime="2025-10-11 10:54:11.885982186 +0000 UTC m=+928.440442478"
Oct 11 10:54:12.409129 master-0 kubenswrapper[4790]: I1011 10:54:12.409042 4790 generic.go:334] "Generic (PLEG): container finished" podID="d58f3b14-e8da-4046-afb1-c376a65ef16e" containerID="6c3235d5d6bf2dc75b5b651ccfac846d26bd4957ea4e687fad602fa916728d6b" exitCode=0
Oct 11 10:54:12.410023 master-0 kubenswrapper[4790]: I1011 10:54:12.409130 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-7vsxp" event={"ID":"d58f3b14-e8da-4046-afb1-c376a65ef16e","Type":"ContainerDied","Data":"6c3235d5d6bf2dc75b5b651ccfac846d26bd4957ea4e687fad602fa916728d6b"}
Oct 11 10:54:12.787456 master-0 kubenswrapper[4790]: I1011 10:54:12.787408 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l7rcp"
Oct 11 10:54:12.872611 master-0 kubenswrapper[4790]: I1011 10:54:12.872508 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mx4m\" (UniqueName: \"kubernetes.io/projected/dfe51cce-787f-4883-8b8f-f1ed50caa3d3-kube-api-access-7mx4m\") pod \"dfe51cce-787f-4883-8b8f-f1ed50caa3d3\" (UID: \"dfe51cce-787f-4883-8b8f-f1ed50caa3d3\") "
Oct 11 10:54:12.875664 master-0 kubenswrapper[4790]: I1011 10:54:12.875591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe51cce-787f-4883-8b8f-f1ed50caa3d3-kube-api-access-7mx4m" (OuterVolumeSpecName: "kube-api-access-7mx4m") pod "dfe51cce-787f-4883-8b8f-f1ed50caa3d3" (UID: "dfe51cce-787f-4883-8b8f-f1ed50caa3d3"). InnerVolumeSpecName "kube-api-access-7mx4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:54:12.928982 master-0 kubenswrapper[4790]: I1011 10:54:12.928922 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jmmst"
Oct 11 10:54:12.974648 master-0 kubenswrapper[4790]: I1011 10:54:12.974510 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mx4m\" (UniqueName: \"kubernetes.io/projected/dfe51cce-787f-4883-8b8f-f1ed50caa3d3-kube-api-access-7mx4m\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:13.075915 master-0 kubenswrapper[4790]: I1011 10:54:13.075819 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp54h\" (UniqueName: \"kubernetes.io/projected/d40b588a-5009-41c8-b8b0-b417de6693ac-kube-api-access-wp54h\") pod \"d40b588a-5009-41c8-b8b0-b417de6693ac\" (UID: \"d40b588a-5009-41c8-b8b0-b417de6693ac\") "
Oct 11 10:54:13.079057 master-0 kubenswrapper[4790]: I1011 10:54:13.079009 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40b588a-5009-41c8-b8b0-b417de6693ac-kube-api-access-wp54h" (OuterVolumeSpecName: "kube-api-access-wp54h") pod "d40b588a-5009-41c8-b8b0-b417de6693ac" (UID: "d40b588a-5009-41c8-b8b0-b417de6693ac"). InnerVolumeSpecName "kube-api-access-wp54h".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:13.178583 master-0 kubenswrapper[4790]: I1011 10:54:13.178468 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp54h\" (UniqueName: \"kubernetes.io/projected/d40b588a-5009-41c8-b8b0-b417de6693ac-kube-api-access-wp54h\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:13.421466 master-0 kubenswrapper[4790]: I1011 10:54:13.421346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jmmst" event={"ID":"d40b588a-5009-41c8-b8b0-b417de6693ac","Type":"ContainerDied","Data":"21ccb6e31874630690c4dcc47df17c20cab9612171ef0f3a7877de641d394997"} Oct 11 10:54:13.421466 master-0 kubenswrapper[4790]: I1011 10:54:13.421434 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ccb6e31874630690c4dcc47df17c20cab9612171ef0f3a7877de641d394997" Oct 11 10:54:13.421466 master-0 kubenswrapper[4790]: I1011 10:54:13.421433 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jmmst" Oct 11 10:54:13.423884 master-0 kubenswrapper[4790]: I1011 10:54:13.423797 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l7rcp" event={"ID":"dfe51cce-787f-4883-8b8f-f1ed50caa3d3","Type":"ContainerDied","Data":"905e6ab440134109d05c0c22661864d57e4a601ca8e1bc57224de6d60efefdcf"} Oct 11 10:54:13.423984 master-0 kubenswrapper[4790]: I1011 10:54:13.423886 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l7rcp" Oct 11 10:54:13.424121 master-0 kubenswrapper[4790]: I1011 10:54:13.423892 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="905e6ab440134109d05c0c22661864d57e4a601ca8e1bc57224de6d60efefdcf" Oct 11 10:54:13.905685 master-0 kubenswrapper[4790]: I1011 10:54:13.905608 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-7vsxp" Oct 11 10:54:13.999027 master-0 kubenswrapper[4790]: I1011 10:54:13.998846 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tfdt\" (UniqueName: \"kubernetes.io/projected/d58f3b14-e8da-4046-afb1-c376a65ef16e-kube-api-access-7tfdt\") pod \"d58f3b14-e8da-4046-afb1-c376a65ef16e\" (UID: \"d58f3b14-e8da-4046-afb1-c376a65ef16e\") " Oct 11 10:54:14.002053 master-0 kubenswrapper[4790]: I1011 10:54:14.001986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58f3b14-e8da-4046-afb1-c376a65ef16e-kube-api-access-7tfdt" (OuterVolumeSpecName: "kube-api-access-7tfdt") pod "d58f3b14-e8da-4046-afb1-c376a65ef16e" (UID: "d58f3b14-e8da-4046-afb1-c376a65ef16e"). InnerVolumeSpecName "kube-api-access-7tfdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:14.104737 master-0 kubenswrapper[4790]: I1011 10:54:14.104225 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tfdt\" (UniqueName: \"kubernetes.io/projected/d58f3b14-e8da-4046-afb1-c376a65ef16e-kube-api-access-7tfdt\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:14.432799 master-0 kubenswrapper[4790]: I1011 10:54:14.432682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-7vsxp" event={"ID":"d58f3b14-e8da-4046-afb1-c376a65ef16e","Type":"ContainerDied","Data":"bf7f6808c9467f778d59d0800baa52d59254b1f21ee42f31e32aab7d481b8f7a"} Oct 11 10:54:14.432799 master-0 kubenswrapper[4790]: I1011 10:54:14.432795 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf7f6808c9467f778d59d0800baa52d59254b1f21ee42f31e32aab7d481b8f7a" Oct 11 10:54:14.433718 master-0 kubenswrapper[4790]: I1011 10:54:14.433671 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-7vsxp" Oct 11 10:54:22.410926 master-0 kubenswrapper[4790]: I1011 10:54:22.410864 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-gvzlv"] Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: E1011 10:54:22.411195 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe51cce-787f-4883-8b8f-f1ed50caa3d3" containerName="mariadb-database-create" Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: I1011 10:54:22.411210 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe51cce-787f-4883-8b8f-f1ed50caa3d3" containerName="mariadb-database-create" Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: E1011 10:54:22.411224 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40b588a-5009-41c8-b8b0-b417de6693ac" containerName="mariadb-database-create" Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: I1011 10:54:22.411231 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40b588a-5009-41c8-b8b0-b417de6693ac" containerName="mariadb-database-create" Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: E1011 10:54:22.411250 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58f3b14-e8da-4046-afb1-c376a65ef16e" containerName="mariadb-database-create" Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: I1011 10:54:22.411257 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58f3b14-e8da-4046-afb1-c376a65ef16e" containerName="mariadb-database-create" Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: I1011 10:54:22.411404 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe51cce-787f-4883-8b8f-f1ed50caa3d3" containerName="mariadb-database-create" Oct 11 10:54:22.411678 master-0 kubenswrapper[4790]: I1011 10:54:22.411426 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40b588a-5009-41c8-b8b0-b417de6693ac" containerName="mariadb-database-create" Oct 11 
10:54:22.411678 master-0 kubenswrapper[4790]: I1011 10:54:22.411435 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58f3b14-e8da-4046-afb1-c376a65ef16e" containerName="mariadb-database-create" Oct 11 10:54:22.412083 master-0 kubenswrapper[4790]: I1011 10:54:22.412052 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-gvzlv" Oct 11 10:54:22.455798 master-0 kubenswrapper[4790]: I1011 10:54:22.453093 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-gvzlv"] Oct 11 10:54:22.556262 master-0 kubenswrapper[4790]: I1011 10:54:22.556210 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2hcp\" (UniqueName: \"kubernetes.io/projected/2137512f-c759-4935-944d-48248c99c2ec-kube-api-access-p2hcp\") pod \"ironic-db-create-gvzlv\" (UID: \"2137512f-c759-4935-944d-48248c99c2ec\") " pod="openstack/ironic-db-create-gvzlv" Oct 11 10:54:22.657868 master-0 kubenswrapper[4790]: I1011 10:54:22.657803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2hcp\" (UniqueName: \"kubernetes.io/projected/2137512f-c759-4935-944d-48248c99c2ec-kube-api-access-p2hcp\") pod \"ironic-db-create-gvzlv\" (UID: \"2137512f-c759-4935-944d-48248c99c2ec\") " pod="openstack/ironic-db-create-gvzlv" Oct 11 10:54:22.687678 master-0 kubenswrapper[4790]: I1011 10:54:22.687516 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2hcp\" (UniqueName: \"kubernetes.io/projected/2137512f-c759-4935-944d-48248c99c2ec-kube-api-access-p2hcp\") pod \"ironic-db-create-gvzlv\" (UID: \"2137512f-c759-4935-944d-48248c99c2ec\") " pod="openstack/ironic-db-create-gvzlv" Oct 11 10:54:22.741382 master-0 kubenswrapper[4790]: I1011 10:54:22.741324 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-gvzlv" Oct 11 10:54:23.179336 master-0 kubenswrapper[4790]: I1011 10:54:23.179289 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-gvzlv"] Oct 11 10:54:23.522408 master-0 kubenswrapper[4790]: I1011 10:54:23.522305 4790 generic.go:334] "Generic (PLEG): container finished" podID="2137512f-c759-4935-944d-48248c99c2ec" containerID="fa9c7f461b0e315bcd532cda39de483b7f3baaed2714bed160ee9f75fc0f43db" exitCode=0 Oct 11 10:54:23.522408 master-0 kubenswrapper[4790]: I1011 10:54:23.522391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-gvzlv" event={"ID":"2137512f-c759-4935-944d-48248c99c2ec","Type":"ContainerDied","Data":"fa9c7f461b0e315bcd532cda39de483b7f3baaed2714bed160ee9f75fc0f43db"} Oct 11 10:54:23.522408 master-0 kubenswrapper[4790]: I1011 10:54:23.522427 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-gvzlv" event={"ID":"2137512f-c759-4935-944d-48248c99c2ec","Type":"ContainerStarted","Data":"2c4922c1ad065c1a799c822a668072dcac70cccb7bbd31842edf52a4efb72f91"} Oct 11 10:54:24.935546 master-0 kubenswrapper[4790]: I1011 10:54:24.935466 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-gvzlv" Oct 11 10:54:25.000407 master-0 kubenswrapper[4790]: I1011 10:54:25.000347 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2hcp\" (UniqueName: \"kubernetes.io/projected/2137512f-c759-4935-944d-48248c99c2ec-kube-api-access-p2hcp\") pod \"2137512f-c759-4935-944d-48248c99c2ec\" (UID: \"2137512f-c759-4935-944d-48248c99c2ec\") " Oct 11 10:54:25.004124 master-0 kubenswrapper[4790]: I1011 10:54:25.004084 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2137512f-c759-4935-944d-48248c99c2ec-kube-api-access-p2hcp" (OuterVolumeSpecName: "kube-api-access-p2hcp") pod "2137512f-c759-4935-944d-48248c99c2ec" (UID: "2137512f-c759-4935-944d-48248c99c2ec"). InnerVolumeSpecName "kube-api-access-p2hcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:25.103049 master-0 kubenswrapper[4790]: I1011 10:54:25.102976 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2hcp\" (UniqueName: \"kubernetes.io/projected/2137512f-c759-4935-944d-48248c99c2ec-kube-api-access-p2hcp\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:25.545834 master-0 kubenswrapper[4790]: I1011 10:54:25.545663 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-gvzlv" event={"ID":"2137512f-c759-4935-944d-48248c99c2ec","Type":"ContainerDied","Data":"2c4922c1ad065c1a799c822a668072dcac70cccb7bbd31842edf52a4efb72f91"} Oct 11 10:54:25.545834 master-0 kubenswrapper[4790]: I1011 10:54:25.545767 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-gvzlv" Oct 11 10:54:25.545834 master-0 kubenswrapper[4790]: I1011 10:54:25.545764 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c4922c1ad065c1a799c822a668072dcac70cccb7bbd31842edf52a4efb72f91" Oct 11 10:54:28.343624 master-0 kubenswrapper[4790]: I1011 10:54:28.343567 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:54:28.344340 master-0 kubenswrapper[4790]: E1011 10:54:28.344194 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2137512f-c759-4935-944d-48248c99c2ec" containerName="mariadb-database-create" Oct 11 10:54:28.344340 master-0 kubenswrapper[4790]: I1011 10:54:28.344210 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2137512f-c759-4935-944d-48248c99c2ec" containerName="mariadb-database-create" Oct 11 10:54:28.344836 master-0 kubenswrapper[4790]: I1011 10:54:28.344529 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2137512f-c759-4935-944d-48248c99c2ec" containerName="mariadb-database-create" Oct 11 10:54:28.346121 master-0 kubenswrapper[4790]: I1011 10:54:28.346089 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.350731 master-0 kubenswrapper[4790]: I1011 10:54:28.350660 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 11 10:54:28.351099 master-0 kubenswrapper[4790]: I1011 10:54:28.351070 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 11 10:54:28.351386 master-0 kubenswrapper[4790]: I1011 10:54:28.351358 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b5802-default-external-config-data" Oct 11 10:54:28.355873 master-0 kubenswrapper[4790]: I1011 10:54:28.355819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.524547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.524625 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.524653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" 
(UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.524823 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.524892 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.525139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.525182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvf6\" (UniqueName: \"kubernetes.io/projected/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-kube-api-access-2wvf6\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.525741 master-0 kubenswrapper[4790]: I1011 10:54:28.525210 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.627155 master-0 kubenswrapper[4790]: I1011 10:54:28.627077 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.627155 master-0 kubenswrapper[4790]: I1011 10:54:28.627132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvf6\" (UniqueName: \"kubernetes.io/projected/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-kube-api-access-2wvf6\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.627155 master-0 kubenswrapper[4790]: I1011 10:54:28.627155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630111 master-0 kubenswrapper[4790]: I1011 10:54:28.627218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630111 master-0 kubenswrapper[4790]: I1011 
10:54:28.627272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630111 master-0 kubenswrapper[4790]: I1011 10:54:28.627299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630111 master-0 kubenswrapper[4790]: I1011 10:54:28.627323 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630111 master-0 kubenswrapper[4790]: I1011 10:54:28.627461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630111 master-0 kubenswrapper[4790]: I1011 10:54:28.627726 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630111 master-0 
kubenswrapper[4790]: I1011 10:54:28.627899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.630576 master-0 kubenswrapper[4790]: I1011 10:54:28.630526 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 11 10:54:28.630666 master-0 kubenswrapper[4790]: I1011 10:54:28.630591 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/95ab3ea1c73b905e55aa0f0a1e574a5056ec96dde23978388ab58fbe89465472/globalmount\"" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.632439 master-0 kubenswrapper[4790]: I1011 10:54:28.632375 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.633067 master-0 kubenswrapper[4790]: I1011 10:54:28.632989 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.633599 master-0 
kubenswrapper[4790]: I1011 10:54:28.633562 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.646221 master-0 kubenswrapper[4790]: I1011 10:54:28.646177 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvf6\" (UniqueName: \"kubernetes.io/projected/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-kube-api-access-2wvf6\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.647370 master-0 kubenswrapper[4790]: I1011 10:54:28.647327 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:28.790326 master-0 kubenswrapper[4790]: I1011 10:54:28.790162 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-9ac8-account-create-r5rxs"] Oct 11 10:54:28.791407 master-0 kubenswrapper[4790]: I1011 10:54:28.791374 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9ac8-account-create-r5rxs" Oct 11 10:54:28.795083 master-0 kubenswrapper[4790]: I1011 10:54:28.794992 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 11 10:54:28.799842 master-0 kubenswrapper[4790]: I1011 10:54:28.799773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9ac8-account-create-r5rxs"] Oct 11 10:54:28.938231 master-0 kubenswrapper[4790]: I1011 10:54:28.937882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45c75\" (UniqueName: \"kubernetes.io/projected/cdd4a60e-f24a-48fe-afcb-c7ccab615f69-kube-api-access-45c75\") pod \"heat-9ac8-account-create-r5rxs\" (UID: \"cdd4a60e-f24a-48fe-afcb-c7ccab615f69\") " pod="openstack/heat-9ac8-account-create-r5rxs" Oct 11 10:54:28.973032 master-0 kubenswrapper[4790]: I1011 10:54:28.972955 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b634-account-create-vb2w7"] Oct 11 10:54:28.974291 master-0 kubenswrapper[4790]: I1011 10:54:28.974265 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b634-account-create-vb2w7" Oct 11 10:54:28.977548 master-0 kubenswrapper[4790]: I1011 10:54:28.977512 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 11 10:54:28.985005 master-0 kubenswrapper[4790]: I1011 10:54:28.984960 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b634-account-create-vb2w7"] Oct 11 10:54:29.015743 master-0 kubenswrapper[4790]: I1011 10:54:29.015645 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b5802-default-internal-api-2"] Oct 11 10:54:29.017447 master-0 kubenswrapper[4790]: I1011 10:54:29.016988 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.020955 master-0 kubenswrapper[4790]: I1011 10:54:29.020890 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b5802-default-internal-config-data" Oct 11 10:54:29.022924 master-0 kubenswrapper[4790]: I1011 10:54:29.022891 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 11 10:54:29.027491 master-0 kubenswrapper[4790]: I1011 10:54:29.027429 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-internal-api-2"] Oct 11 10:54:29.049056 master-0 kubenswrapper[4790]: I1011 10:54:29.039470 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45c75\" (UniqueName: \"kubernetes.io/projected/cdd4a60e-f24a-48fe-afcb-c7ccab615f69-kube-api-access-45c75\") pod \"heat-9ac8-account-create-r5rxs\" (UID: \"cdd4a60e-f24a-48fe-afcb-c7ccab615f69\") " pod="openstack/heat-9ac8-account-create-r5rxs" Oct 11 10:54:29.061672 master-0 kubenswrapper[4790]: I1011 10:54:29.061604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45c75\" (UniqueName: \"kubernetes.io/projected/cdd4a60e-f24a-48fe-afcb-c7ccab615f69-kube-api-access-45c75\") pod \"heat-9ac8-account-create-r5rxs\" (UID: \"cdd4a60e-f24a-48fe-afcb-c7ccab615f69\") " pod="openstack/heat-9ac8-account-create-r5rxs" Oct 11 10:54:29.117732 master-0 kubenswrapper[4790]: I1011 10:54:29.117133 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9ac8-account-create-r5rxs" Oct 11 10:54:29.140841 master-0 kubenswrapper[4790]: I1011 10:54:29.140803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.140941 master-0 kubenswrapper[4790]: I1011 10:54:29.140882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-combined-ca-bundle\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.140941 master-0 kubenswrapper[4790]: I1011 10:54:29.140915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-config-data\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.141001 master-0 kubenswrapper[4790]: I1011 10:54:29.140939 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-logs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.141001 master-0 kubenswrapper[4790]: I1011 10:54:29.140975 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-scripts\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.141077 master-0 kubenswrapper[4790]: I1011 10:54:29.141017 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vczqt\" (UniqueName: \"kubernetes.io/projected/a0a5aa40-0146-4b81-83dd-761d514c557a-kube-api-access-vczqt\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.141077 master-0 kubenswrapper[4790]: I1011 10:54:29.141037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-httpd-run\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.141077 master-0 kubenswrapper[4790]: I1011 10:54:29.141063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkpf5\" (UniqueName: \"kubernetes.io/projected/09ddf95f-6e9c-4f3c-b742-87379c6594b2-kube-api-access-xkpf5\") pod \"cinder-b634-account-create-vb2w7\" (UID: \"09ddf95f-6e9c-4f3c-b742-87379c6594b2\") " pod="openstack/cinder-b634-account-create-vb2w7" Oct 11 10:54:29.141077 master-0 kubenswrapper[4790]: I1011 10:54:29.141274 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-internal-tls-certs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.171858 
master-0 kubenswrapper[4790]: I1011 10:54:29.171671 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2033-account-create-jh9gc"] Oct 11 10:54:29.173189 master-0 kubenswrapper[4790]: I1011 10:54:29.173169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2033-account-create-jh9gc" Oct 11 10:54:29.175780 master-0 kubenswrapper[4790]: I1011 10:54:29.175760 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 11 10:54:29.184241 master-0 kubenswrapper[4790]: I1011 10:54:29.183619 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2033-account-create-jh9gc"] Oct 11 10:54:29.243199 master-0 kubenswrapper[4790]: I1011 10:54:29.243124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.243534 master-0 kubenswrapper[4790]: I1011 10:54:29.243512 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-combined-ca-bundle\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.243650 master-0 kubenswrapper[4790]: I1011 10:54:29.243630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-config-data\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.243800 master-0 
kubenswrapper[4790]: I1011 10:54:29.243781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-logs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.243935 master-0 kubenswrapper[4790]: I1011 10:54:29.243917 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-scripts\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.244088 master-0 kubenswrapper[4790]: I1011 10:54:29.244070 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vczqt\" (UniqueName: \"kubernetes.io/projected/a0a5aa40-0146-4b81-83dd-761d514c557a-kube-api-access-vczqt\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.244201 master-0 kubenswrapper[4790]: I1011 10:54:29.244182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-httpd-run\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.244298 master-0 kubenswrapper[4790]: I1011 10:54:29.244281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkpf5\" (UniqueName: \"kubernetes.io/projected/09ddf95f-6e9c-4f3c-b742-87379c6594b2-kube-api-access-xkpf5\") pod \"cinder-b634-account-create-vb2w7\" (UID: \"09ddf95f-6e9c-4f3c-b742-87379c6594b2\") " 
pod="openstack/cinder-b634-account-create-vb2w7" Oct 11 10:54:29.244383 master-0 kubenswrapper[4790]: I1011 10:54:29.244370 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-internal-tls-certs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.245568 master-0 kubenswrapper[4790]: I1011 10:54:29.245549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-logs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.246249 master-0 kubenswrapper[4790]: I1011 10:54:29.246192 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-httpd-run\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.251222 master-0 kubenswrapper[4790]: I1011 10:54:29.251182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-combined-ca-bundle\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.252167 master-0 kubenswrapper[4790]: I1011 10:54:29.252146 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 10:54:29.252310 master-0 kubenswrapper[4790]: I1011 10:54:29.252266 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/b0c7c7eacbecbf6beec44181cd1a14327b215e622b505cc0fbc4653c9c57c6ce/globalmount\"" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.252807 master-0 kubenswrapper[4790]: I1011 10:54:29.252399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-config-data\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.252807 master-0 kubenswrapper[4790]: I1011 10:54:29.252689 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-internal-tls-certs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.254742 master-0 kubenswrapper[4790]: I1011 10:54:29.254680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-scripts\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.271543 master-0 kubenswrapper[4790]: I1011 10:54:29.270535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vczqt\" (UniqueName: 
\"kubernetes.io/projected/a0a5aa40-0146-4b81-83dd-761d514c557a-kube-api-access-vczqt\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:29.274669 master-0 kubenswrapper[4790]: I1011 10:54:29.274446 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkpf5\" (UniqueName: \"kubernetes.io/projected/09ddf95f-6e9c-4f3c-b742-87379c6594b2-kube-api-access-xkpf5\") pod \"cinder-b634-account-create-vb2w7\" (UID: \"09ddf95f-6e9c-4f3c-b742-87379c6594b2\") " pod="openstack/cinder-b634-account-create-vb2w7" Oct 11 10:54:29.311237 master-0 kubenswrapper[4790]: I1011 10:54:29.311038 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b634-account-create-vb2w7" Oct 11 10:54:29.346426 master-0 kubenswrapper[4790]: I1011 10:54:29.346342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjx4\" (UniqueName: \"kubernetes.io/projected/08a325c6-b9b6-495b-87dc-d6e12b3f1029-kube-api-access-gbjx4\") pod \"neutron-2033-account-create-jh9gc\" (UID: \"08a325c6-b9b6-495b-87dc-d6e12b3f1029\") " pod="openstack/neutron-2033-account-create-jh9gc" Oct 11 10:54:29.448531 master-0 kubenswrapper[4790]: I1011 10:54:29.448458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjx4\" (UniqueName: \"kubernetes.io/projected/08a325c6-b9b6-495b-87dc-d6e12b3f1029-kube-api-access-gbjx4\") pod \"neutron-2033-account-create-jh9gc\" (UID: \"08a325c6-b9b6-495b-87dc-d6e12b3f1029\") " pod="openstack/neutron-2033-account-create-jh9gc" Oct 11 10:54:29.488823 master-0 kubenswrapper[4790]: I1011 10:54:29.488460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjx4\" (UniqueName: \"kubernetes.io/projected/08a325c6-b9b6-495b-87dc-d6e12b3f1029-kube-api-access-gbjx4\") pod 
\"neutron-2033-account-create-jh9gc\" (UID: \"08a325c6-b9b6-495b-87dc-d6e12b3f1029\") " pod="openstack/neutron-2033-account-create-jh9gc" Oct 11 10:54:29.539033 master-0 kubenswrapper[4790]: I1011 10:54:29.538864 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2033-account-create-jh9gc" Oct 11 10:54:29.608149 master-0 kubenswrapper[4790]: I1011 10:54:29.608080 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9ac8-account-create-r5rxs"] Oct 11 10:54:29.761757 master-0 kubenswrapper[4790]: I1011 10:54:29.760313 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b634-account-create-vb2w7"] Oct 11 10:54:29.794841 master-0 kubenswrapper[4790]: W1011 10:54:29.794473 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09ddf95f_6e9c_4f3c_b742_87379c6594b2.slice/crio-4a0da43bdbb6cb8eab5b1d1a5b54d52710e81cb9cc902f6c2bb9187ab577a089 WatchSource:0}: Error finding container 4a0da43bdbb6cb8eab5b1d1a5b54d52710e81cb9cc902f6c2bb9187ab577a089: Status 404 returned error can't find the container with id 4a0da43bdbb6cb8eab5b1d1a5b54d52710e81cb9cc902f6c2bb9187ab577a089 Oct 11 10:54:30.073293 master-0 kubenswrapper[4790]: I1011 10:54:30.073198 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2033-account-create-jh9gc"] Oct 11 10:54:30.593724 master-0 kubenswrapper[4790]: I1011 10:54:30.593646 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:30.608153 master-0 kubenswrapper[4790]: I1011 10:54:30.608090 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="09ddf95f-6e9c-4f3c-b742-87379c6594b2" containerID="fd9735379426d4418da18546aac8b7806a6015a386e483f957b980d675840314" exitCode=0 Oct 11 10:54:30.608355 master-0 kubenswrapper[4790]: I1011 10:54:30.608166 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b634-account-create-vb2w7" event={"ID":"09ddf95f-6e9c-4f3c-b742-87379c6594b2","Type":"ContainerDied","Data":"fd9735379426d4418da18546aac8b7806a6015a386e483f957b980d675840314"} Oct 11 10:54:30.608355 master-0 kubenswrapper[4790]: I1011 10:54:30.608256 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b634-account-create-vb2w7" event={"ID":"09ddf95f-6e9c-4f3c-b742-87379c6594b2","Type":"ContainerStarted","Data":"4a0da43bdbb6cb8eab5b1d1a5b54d52710e81cb9cc902f6c2bb9187ab577a089"} Oct 11 10:54:30.610163 master-0 kubenswrapper[4790]: I1011 10:54:30.610110 4790 generic.go:334] "Generic (PLEG): container finished" podID="08a325c6-b9b6-495b-87dc-d6e12b3f1029" containerID="5ca20afffe15faa31f5c2c1443a96be8fe5b0268275280368238f1f4b32ef4f2" exitCode=0 Oct 11 10:54:30.610221 master-0 kubenswrapper[4790]: I1011 10:54:30.610196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2033-account-create-jh9gc" event={"ID":"08a325c6-b9b6-495b-87dc-d6e12b3f1029","Type":"ContainerDied","Data":"5ca20afffe15faa31f5c2c1443a96be8fe5b0268275280368238f1f4b32ef4f2"} Oct 11 10:54:30.610299 master-0 kubenswrapper[4790]: I1011 10:54:30.610284 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2033-account-create-jh9gc" event={"ID":"08a325c6-b9b6-495b-87dc-d6e12b3f1029","Type":"ContainerStarted","Data":"064a003115b13d4cbf140f9fcffdd93b2d44775212a0623054da53071870c839"} Oct 11 10:54:30.613835 master-0 kubenswrapper[4790]: I1011 10:54:30.613802 4790 generic.go:334] "Generic (PLEG): container finished" podID="cdd4a60e-f24a-48fe-afcb-c7ccab615f69" containerID="050e70bb3b6a03db41f9fcb784b5238c3ff9d94ed85c503d6f9f58f7bd27daa0" exitCode=0 Oct 11 
10:54:30.613835 master-0 kubenswrapper[4790]: I1011 10:54:30.613831 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9ac8-account-create-r5rxs" event={"ID":"cdd4a60e-f24a-48fe-afcb-c7ccab615f69","Type":"ContainerDied","Data":"050e70bb3b6a03db41f9fcb784b5238c3ff9d94ed85c503d6f9f58f7bd27daa0"} Oct 11 10:54:30.614071 master-0 kubenswrapper[4790]: I1011 10:54:30.613849 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9ac8-account-create-r5rxs" event={"ID":"cdd4a60e-f24a-48fe-afcb-c7ccab615f69","Type":"ContainerStarted","Data":"d6ae65f963950836b1a177c858bc761c8e099ea553721fa061ba026946ca1a96"} Oct 11 10:54:30.818199 master-0 kubenswrapper[4790]: I1011 10:54:30.818133 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:54:31.397833 master-0 kubenswrapper[4790]: I1011 10:54:31.397734 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:54:31.404854 master-0 kubenswrapper[4790]: W1011 10:54:31.404792 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ebc6e3_ce04_4aac_bb04_ded9662f65e3.slice/crio-ba85d8283f47fbc71a069e2badeb37c9ec3baa560e78e0e496d6945e23ef982f WatchSource:0}: Error finding container ba85d8283f47fbc71a069e2badeb37c9ec3baa560e78e0e496d6945e23ef982f: Status 404 returned error can't find the container with id ba85d8283f47fbc71a069e2badeb37c9ec3baa560e78e0e496d6945e23ef982f Oct 11 10:54:31.622082 master-0 kubenswrapper[4790]: I1011 10:54:31.621962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"06ebc6e3-ce04-4aac-bb04-ded9662f65e3","Type":"ContainerStarted","Data":"ba85d8283f47fbc71a069e2badeb37c9ec3baa560e78e0e496d6945e23ef982f"} Oct 11 10:54:32.103410 master-0 kubenswrapper[4790]: I1011 10:54:32.103351 
4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b634-account-create-vb2w7" Oct 11 10:54:32.226764 master-0 kubenswrapper[4790]: I1011 10:54:32.226666 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkpf5\" (UniqueName: \"kubernetes.io/projected/09ddf95f-6e9c-4f3c-b742-87379c6594b2-kube-api-access-xkpf5\") pod \"09ddf95f-6e9c-4f3c-b742-87379c6594b2\" (UID: \"09ddf95f-6e9c-4f3c-b742-87379c6594b2\") " Oct 11 10:54:32.229473 master-0 kubenswrapper[4790]: I1011 10:54:32.229408 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ddf95f-6e9c-4f3c-b742-87379c6594b2-kube-api-access-xkpf5" (OuterVolumeSpecName: "kube-api-access-xkpf5") pod "09ddf95f-6e9c-4f3c-b742-87379c6594b2" (UID: "09ddf95f-6e9c-4f3c-b742-87379c6594b2"). InnerVolumeSpecName "kube-api-access-xkpf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:32.276351 master-0 kubenswrapper[4790]: I1011 10:54:32.276297 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9ac8-account-create-r5rxs" Oct 11 10:54:32.279349 master-0 kubenswrapper[4790]: I1011 10:54:32.279305 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:32.282844 master-0 kubenswrapper[4790]: I1011 10:54:32.282800 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2033-account-create-jh9gc" Oct 11 10:54:32.330954 master-0 kubenswrapper[4790]: I1011 10:54:32.330893 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkpf5\" (UniqueName: \"kubernetes.io/projected/09ddf95f-6e9c-4f3c-b742-87379c6594b2-kube-api-access-xkpf5\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:32.333568 master-0 kubenswrapper[4790]: I1011 10:54:32.333480 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:54:32.432061 master-0 kubenswrapper[4790]: I1011 10:54:32.431903 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjx4\" (UniqueName: \"kubernetes.io/projected/08a325c6-b9b6-495b-87dc-d6e12b3f1029-kube-api-access-gbjx4\") pod \"08a325c6-b9b6-495b-87dc-d6e12b3f1029\" (UID: \"08a325c6-b9b6-495b-87dc-d6e12b3f1029\") " Oct 11 10:54:32.432061 master-0 kubenswrapper[4790]: I1011 10:54:32.432020 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45c75\" (UniqueName: \"kubernetes.io/projected/cdd4a60e-f24a-48fe-afcb-c7ccab615f69-kube-api-access-45c75\") pod \"cdd4a60e-f24a-48fe-afcb-c7ccab615f69\" (UID: \"cdd4a60e-f24a-48fe-afcb-c7ccab615f69\") " Oct 11 10:54:32.439084 master-0 kubenswrapper[4790]: I1011 10:54:32.437808 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a325c6-b9b6-495b-87dc-d6e12b3f1029-kube-api-access-gbjx4" (OuterVolumeSpecName: "kube-api-access-gbjx4") pod "08a325c6-b9b6-495b-87dc-d6e12b3f1029" (UID: "08a325c6-b9b6-495b-87dc-d6e12b3f1029"). InnerVolumeSpecName "kube-api-access-gbjx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:32.440666 master-0 kubenswrapper[4790]: I1011 10:54:32.440155 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd4a60e-f24a-48fe-afcb-c7ccab615f69-kube-api-access-45c75" (OuterVolumeSpecName: "kube-api-access-45c75") pod "cdd4a60e-f24a-48fe-afcb-c7ccab615f69" (UID: "cdd4a60e-f24a-48fe-afcb-c7ccab615f69"). InnerVolumeSpecName "kube-api-access-45c75". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:32.534227 master-0 kubenswrapper[4790]: I1011 10:54:32.534113 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-e1bf-account-create-9qds8"] Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: E1011 10:54:32.534466 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd4a60e-f24a-48fe-afcb-c7ccab615f69" containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534483 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd4a60e-f24a-48fe-afcb-c7ccab615f69" containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: E1011 10:54:32.534527 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ddf95f-6e9c-4f3c-b742-87379c6594b2" containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534544 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ddf95f-6e9c-4f3c-b742-87379c6594b2" containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: E1011 10:54:32.534564 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a325c6-b9b6-495b-87dc-d6e12b3f1029" containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534572 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a325c6-b9b6-495b-87dc-d6e12b3f1029" 
containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534691 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a325c6-b9b6-495b-87dc-d6e12b3f1029" containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534519 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjx4\" (UniqueName: \"kubernetes.io/projected/08a325c6-b9b6-495b-87dc-d6e12b3f1029-kube-api-access-gbjx4\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534730 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd4a60e-f24a-48fe-afcb-c7ccab615f69" containerName="mariadb-account-create" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534739 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45c75\" (UniqueName: \"kubernetes.io/projected/cdd4a60e-f24a-48fe-afcb-c7ccab615f69-kube-api-access-45c75\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:32.534909 master-0 kubenswrapper[4790]: I1011 10:54:32.534744 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ddf95f-6e9c-4f3c-b742-87379c6594b2" containerName="mariadb-account-create" Oct 11 10:54:32.537560 master-0 kubenswrapper[4790]: I1011 10:54:32.537005 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-e1bf-account-create-9qds8" Oct 11 10:54:32.542027 master-0 kubenswrapper[4790]: I1011 10:54:32.541957 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Oct 11 10:54:32.542397 master-0 kubenswrapper[4790]: I1011 10:54:32.542362 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-e1bf-account-create-9qds8"] Oct 11 10:54:32.631555 master-0 kubenswrapper[4790]: I1011 10:54:32.631482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9ac8-account-create-r5rxs" event={"ID":"cdd4a60e-f24a-48fe-afcb-c7ccab615f69","Type":"ContainerDied","Data":"d6ae65f963950836b1a177c858bc761c8e099ea553721fa061ba026946ca1a96"} Oct 11 10:54:32.631555 master-0 kubenswrapper[4790]: I1011 10:54:32.631561 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6ae65f963950836b1a177c858bc761c8e099ea553721fa061ba026946ca1a96" Oct 11 10:54:32.632222 master-0 kubenswrapper[4790]: I1011 10:54:32.631588 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9ac8-account-create-r5rxs" Oct 11 10:54:32.633129 master-0 kubenswrapper[4790]: I1011 10:54:32.633068 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2033-account-create-jh9gc" event={"ID":"08a325c6-b9b6-495b-87dc-d6e12b3f1029","Type":"ContainerDied","Data":"064a003115b13d4cbf140f9fcffdd93b2d44775212a0623054da53071870c839"} Oct 11 10:54:32.633196 master-0 kubenswrapper[4790]: I1011 10:54:32.633112 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2033-account-create-jh9gc" Oct 11 10:54:32.633271 master-0 kubenswrapper[4790]: I1011 10:54:32.633129 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="064a003115b13d4cbf140f9fcffdd93b2d44775212a0623054da53071870c839" Oct 11 10:54:32.636120 master-0 kubenswrapper[4790]: I1011 10:54:32.636046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b634-account-create-vb2w7" event={"ID":"09ddf95f-6e9c-4f3c-b742-87379c6594b2","Type":"ContainerDied","Data":"4a0da43bdbb6cb8eab5b1d1a5b54d52710e81cb9cc902f6c2bb9187ab577a089"} Oct 11 10:54:32.636186 master-0 kubenswrapper[4790]: I1011 10:54:32.636122 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a0da43bdbb6cb8eab5b1d1a5b54d52710e81cb9cc902f6c2bb9187ab577a089" Oct 11 10:54:32.636186 master-0 kubenswrapper[4790]: I1011 10:54:32.636139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxmr\" (UniqueName: \"kubernetes.io/projected/03b3f6bf-ef4b-41fa-b098-fc5620a92300-kube-api-access-lqxmr\") pod \"ironic-e1bf-account-create-9qds8\" (UID: \"03b3f6bf-ef4b-41fa-b098-fc5620a92300\") " pod="openstack/ironic-e1bf-account-create-9qds8" Oct 11 10:54:32.636186 master-0 kubenswrapper[4790]: I1011 10:54:32.636160 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b634-account-create-vb2w7" Oct 11 10:54:32.773257 master-0 kubenswrapper[4790]: I1011 10:54:32.773076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxmr\" (UniqueName: \"kubernetes.io/projected/03b3f6bf-ef4b-41fa-b098-fc5620a92300-kube-api-access-lqxmr\") pod \"ironic-e1bf-account-create-9qds8\" (UID: \"03b3f6bf-ef4b-41fa-b098-fc5620a92300\") " pod="openstack/ironic-e1bf-account-create-9qds8" Oct 11 10:54:32.798508 master-0 kubenswrapper[4790]: I1011 10:54:32.798453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxmr\" (UniqueName: \"kubernetes.io/projected/03b3f6bf-ef4b-41fa-b098-fc5620a92300-kube-api-access-lqxmr\") pod \"ironic-e1bf-account-create-9qds8\" (UID: \"03b3f6bf-ef4b-41fa-b098-fc5620a92300\") " pod="openstack/ironic-e1bf-account-create-9qds8" Oct 11 10:54:32.869287 master-0 kubenswrapper[4790]: I1011 10:54:32.869220 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-e1bf-account-create-9qds8" Oct 11 10:54:32.928292 master-0 kubenswrapper[4790]: I1011 10:54:32.928219 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-internal-api-2"] Oct 11 10:54:32.937260 master-0 kubenswrapper[4790]: W1011 10:54:32.937179 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a5aa40_0146_4b81_83dd_761d514c557a.slice/crio-a7f963bb2db0e5602164d87267bd8833838260ebb96c568d5c14b7ec412ff1e4 WatchSource:0}: Error finding container a7f963bb2db0e5602164d87267bd8833838260ebb96c568d5c14b7ec412ff1e4: Status 404 returned error can't find the container with id a7f963bb2db0e5602164d87267bd8833838260ebb96c568d5c14b7ec412ff1e4 Oct 11 10:54:33.289918 master-0 kubenswrapper[4790]: I1011 10:54:33.289857 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-e1bf-account-create-9qds8"] Oct 11 10:54:33.309421 master-0 kubenswrapper[4790]: W1011 10:54:33.309354 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03b3f6bf_ef4b_41fa_b098_fc5620a92300.slice/crio-06d9bd3df217f4f9548cd17d34579c4352b757d706e48b231fa744d232a50671 WatchSource:0}: Error finding container 06d9bd3df217f4f9548cd17d34579c4352b757d706e48b231fa744d232a50671: Status 404 returned error can't find the container with id 06d9bd3df217f4f9548cd17d34579c4352b757d706e48b231fa744d232a50671 Oct 11 10:54:33.616685 master-0 kubenswrapper[4790]: I1011 10:54:33.614456 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b597cbbf8-mh4z2"] Oct 11 10:54:33.618895 master-0 kubenswrapper[4790]: I1011 10:54:33.617923 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.622511 master-0 kubenswrapper[4790]: I1011 10:54:33.622464 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 11 10:54:33.622511 master-0 kubenswrapper[4790]: I1011 10:54:33.622503 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 11 10:54:33.622860 master-0 kubenswrapper[4790]: I1011 10:54:33.622544 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 11 10:54:33.622860 master-0 kubenswrapper[4790]: I1011 10:54:33.622738 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 11 10:54:33.623294 master-0 kubenswrapper[4790]: I1011 10:54:33.622964 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b597cbbf8-mh4z2"] Oct 11 10:54:33.647309 master-0 kubenswrapper[4790]: I1011 10:54:33.647224 4790 generic.go:334] "Generic (PLEG): container finished" podID="03b3f6bf-ef4b-41fa-b098-fc5620a92300" containerID="f18d3e7808bc3bd6d8d3dfcffe3def526d4ab16b836ac39e9bb14dfceb0d8247" exitCode=0 Oct 11 10:54:33.647975 master-0 kubenswrapper[4790]: I1011 10:54:33.647308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-e1bf-account-create-9qds8" event={"ID":"03b3f6bf-ef4b-41fa-b098-fc5620a92300","Type":"ContainerDied","Data":"f18d3e7808bc3bd6d8d3dfcffe3def526d4ab16b836ac39e9bb14dfceb0d8247"} Oct 11 10:54:33.647975 master-0 kubenswrapper[4790]: I1011 10:54:33.647419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-e1bf-account-create-9qds8" event={"ID":"03b3f6bf-ef4b-41fa-b098-fc5620a92300","Type":"ContainerStarted","Data":"06d9bd3df217f4f9548cd17d34579c4352b757d706e48b231fa744d232a50671"} Oct 11 10:54:33.649007 master-0 kubenswrapper[4790]: I1011 10:54:33.648975 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"a0a5aa40-0146-4b81-83dd-761d514c557a","Type":"ContainerStarted","Data":"a7f963bb2db0e5602164d87267bd8833838260ebb96c568d5c14b7ec412ff1e4"} Oct 11 10:54:33.793036 master-0 kubenswrapper[4790]: I1011 10:54:33.792969 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-config-data\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.793036 master-0 kubenswrapper[4790]: I1011 10:54:33.793039 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-combined-ca-bundle\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.793322 master-0 kubenswrapper[4790]: I1011 10:54:33.793099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-public-tls-certs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.793322 master-0 kubenswrapper[4790]: I1011 10:54:33.793120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7322b229-2c7a-4d99-a73b-f3612dc2670e-logs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.793416 master-0 kubenswrapper[4790]: I1011 10:54:33.793373 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mrvk\" (UniqueName: \"kubernetes.io/projected/7322b229-2c7a-4d99-a73b-f3612dc2670e-kube-api-access-7mrvk\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.793495 master-0 kubenswrapper[4790]: I1011 10:54:33.793444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-internal-tls-certs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.793668 master-0 kubenswrapper[4790]: I1011 10:54:33.793628 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-scripts\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.895318 master-0 kubenswrapper[4790]: I1011 10:54:33.895069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mrvk\" (UniqueName: \"kubernetes.io/projected/7322b229-2c7a-4d99-a73b-f3612dc2670e-kube-api-access-7mrvk\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.895318 master-0 kubenswrapper[4790]: I1011 10:54:33.895140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-internal-tls-certs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.895318 master-0 
kubenswrapper[4790]: I1011 10:54:33.895204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-scripts\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.895318 master-0 kubenswrapper[4790]: I1011 10:54:33.895270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-config-data\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.895318 master-0 kubenswrapper[4790]: I1011 10:54:33.895302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-combined-ca-bundle\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.895852 master-0 kubenswrapper[4790]: I1011 10:54:33.895356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-public-tls-certs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.895852 master-0 kubenswrapper[4790]: I1011 10:54:33.895381 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7322b229-2c7a-4d99-a73b-f3612dc2670e-logs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.896018 master-0 kubenswrapper[4790]: I1011 10:54:33.895889 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7322b229-2c7a-4d99-a73b-f3612dc2670e-logs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.900185 master-0 kubenswrapper[4790]: I1011 10:54:33.900135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-scripts\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.900787 master-0 kubenswrapper[4790]: I1011 10:54:33.900754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-config-data\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.901952 master-0 kubenswrapper[4790]: I1011 10:54:33.901902 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-internal-tls-certs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.902517 master-0 kubenswrapper[4790]: I1011 10:54:33.902463 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-public-tls-certs\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.903900 master-0 kubenswrapper[4790]: I1011 10:54:33.903825 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7322b229-2c7a-4d99-a73b-f3612dc2670e-combined-ca-bundle\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.920889 master-0 kubenswrapper[4790]: I1011 10:54:33.920833 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mrvk\" (UniqueName: \"kubernetes.io/projected/7322b229-2c7a-4d99-a73b-f3612dc2670e-kube-api-access-7mrvk\") pod \"placement-6b597cbbf8-mh4z2\" (UID: \"7322b229-2c7a-4d99-a73b-f3612dc2670e\") " pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:33.968143 master-0 kubenswrapper[4790]: I1011 10:54:33.968080 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:54:34.414163 master-0 kubenswrapper[4790]: I1011 10:54:34.414098 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b597cbbf8-mh4z2"] Oct 11 10:54:34.658839 master-0 kubenswrapper[4790]: I1011 10:54:34.658694 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b597cbbf8-mh4z2" event={"ID":"7322b229-2c7a-4d99-a73b-f3612dc2670e","Type":"ContainerStarted","Data":"6768bfbd8eb1927a0a3fde9b51e4a592023d348c1f0cfddf53347324b56df409"} Oct 11 10:54:35.151321 master-0 kubenswrapper[4790]: I1011 10:54:35.151267 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-e1bf-account-create-9qds8" Oct 11 10:54:35.336885 master-0 kubenswrapper[4790]: I1011 10:54:35.336826 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqxmr\" (UniqueName: \"kubernetes.io/projected/03b3f6bf-ef4b-41fa-b098-fc5620a92300-kube-api-access-lqxmr\") pod \"03b3f6bf-ef4b-41fa-b098-fc5620a92300\" (UID: \"03b3f6bf-ef4b-41fa-b098-fc5620a92300\") " Oct 11 10:54:35.340218 master-0 kubenswrapper[4790]: I1011 10:54:35.340132 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b3f6bf-ef4b-41fa-b098-fc5620a92300-kube-api-access-lqxmr" (OuterVolumeSpecName: "kube-api-access-lqxmr") pod "03b3f6bf-ef4b-41fa-b098-fc5620a92300" (UID: "03b3f6bf-ef4b-41fa-b098-fc5620a92300"). InnerVolumeSpecName "kube-api-access-lqxmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:35.439149 master-0 kubenswrapper[4790]: I1011 10:54:35.439100 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqxmr\" (UniqueName: \"kubernetes.io/projected/03b3f6bf-ef4b-41fa-b098-fc5620a92300-kube-api-access-lqxmr\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:35.668487 master-0 kubenswrapper[4790]: I1011 10:54:35.668336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-e1bf-account-create-9qds8" event={"ID":"03b3f6bf-ef4b-41fa-b098-fc5620a92300","Type":"ContainerDied","Data":"06d9bd3df217f4f9548cd17d34579c4352b757d706e48b231fa744d232a50671"} Oct 11 10:54:35.668487 master-0 kubenswrapper[4790]: I1011 10:54:35.668391 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06d9bd3df217f4f9548cd17d34579c4352b757d706e48b231fa744d232a50671" Oct 11 10:54:35.668487 master-0 kubenswrapper[4790]: I1011 10:54:35.668466 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-e1bf-account-create-9qds8" Oct 11 10:54:36.762930 master-0 kubenswrapper[4790]: I1011 10:54:36.761783 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-848fcbb4df-dr4lc"] Oct 11 10:54:36.762930 master-0 kubenswrapper[4790]: E1011 10:54:36.762117 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b3f6bf-ef4b-41fa-b098-fc5620a92300" containerName="mariadb-account-create" Oct 11 10:54:36.762930 master-0 kubenswrapper[4790]: I1011 10:54:36.762132 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b3f6bf-ef4b-41fa-b098-fc5620a92300" containerName="mariadb-account-create" Oct 11 10:54:36.762930 master-0 kubenswrapper[4790]: I1011 10:54:36.762284 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b3f6bf-ef4b-41fa-b098-fc5620a92300" containerName="mariadb-account-create" Oct 11 10:54:36.763781 master-0 kubenswrapper[4790]: I1011 10:54:36.763000 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.765691 master-0 kubenswrapper[4790]: I1011 10:54:36.765648 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 11 10:54:36.766836 master-0 kubenswrapper[4790]: I1011 10:54:36.766776 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-public-tls-certs\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.766897 master-0 kubenswrapper[4790]: I1011 10:54:36.766839 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcmmz\" (UniqueName: \"kubernetes.io/projected/bc0250a9-8454-4716-8e79-36166266decb-kube-api-access-kcmmz\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.766897 master-0 kubenswrapper[4790]: I1011 10:54:36.766873 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-combined-ca-bundle\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.766970 master-0 kubenswrapper[4790]: I1011 10:54:36.766930 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-internal-tls-certs\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.766970 master-0 kubenswrapper[4790]: I1011 
10:54:36.766964 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-fernet-keys\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.767034 master-0 kubenswrapper[4790]: I1011 10:54:36.766998 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-scripts\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.767139 master-0 kubenswrapper[4790]: I1011 10:54:36.767090 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-config-data\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.767139 master-0 kubenswrapper[4790]: I1011 10:54:36.767132 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-credential-keys\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.768338 master-0 kubenswrapper[4790]: I1011 10:54:36.767951 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 11 10:54:36.770121 master-0 kubenswrapper[4790]: I1011 10:54:36.769606 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 11 10:54:36.770121 master-0 kubenswrapper[4790]: I1011 10:54:36.769844 4790 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 11 10:54:36.770121 master-0 kubenswrapper[4790]: I1011 10:54:36.769965 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 11 10:54:36.780438 master-0 kubenswrapper[4790]: I1011 10:54:36.780370 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-848fcbb4df-dr4lc"] Oct 11 10:54:36.868689 master-0 kubenswrapper[4790]: I1011 10:54:36.868597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcmmz\" (UniqueName: \"kubernetes.io/projected/bc0250a9-8454-4716-8e79-36166266decb-kube-api-access-kcmmz\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.868689 master-0 kubenswrapper[4790]: I1011 10:54:36.868662 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-public-tls-certs\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.868689 master-0 kubenswrapper[4790]: I1011 10:54:36.868695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-combined-ca-bundle\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.869164 master-0 kubenswrapper[4790]: I1011 10:54:36.868745 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-internal-tls-certs\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " 
pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.869452 master-0 kubenswrapper[4790]: I1011 10:54:36.868774 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-fernet-keys\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.869527 master-0 kubenswrapper[4790]: I1011 10:54:36.869469 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-scripts\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.869527 master-0 kubenswrapper[4790]: I1011 10:54:36.869518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-config-data\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.869643 master-0 kubenswrapper[4790]: I1011 10:54:36.869546 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-credential-keys\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.873308 master-0 kubenswrapper[4790]: I1011 10:54:36.873264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-public-tls-certs\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.873782 
master-0 kubenswrapper[4790]: I1011 10:54:36.873746 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-scripts\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.873871 master-0 kubenswrapper[4790]: I1011 10:54:36.873739 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-config-data\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.874395 master-0 kubenswrapper[4790]: I1011 10:54:36.874351 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-internal-tls-certs\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.874395 master-0 kubenswrapper[4790]: I1011 10:54:36.874386 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-fernet-keys\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.875626 master-0 kubenswrapper[4790]: I1011 10:54:36.874902 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-combined-ca-bundle\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.879269 master-0 kubenswrapper[4790]: I1011 10:54:36.879074 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc0250a9-8454-4716-8e79-36166266decb-credential-keys\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:36.893325 master-0 kubenswrapper[4790]: I1011 10:54:36.893227 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcmmz\" (UniqueName: \"kubernetes.io/projected/bc0250a9-8454-4716-8e79-36166266decb-kube-api-access-kcmmz\") pod \"keystone-848fcbb4df-dr4lc\" (UID: \"bc0250a9-8454-4716-8e79-36166266decb\") " pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:37.082156 master-0 kubenswrapper[4790]: I1011 10:54:37.082072 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:54:42.742832 master-0 kubenswrapper[4790]: I1011 10:54:42.742768 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:54:43.652742 master-0 kubenswrapper[4790]: I1011 10:54:43.652672 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c99f4877f-dv8jt"] Oct 11 10:54:43.653542 master-0 kubenswrapper[4790]: I1011 10:54:43.653477 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" podUID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerName="dnsmasq-dns" containerID="cri-o://381e97f0118ecd0a55897d40739e39a7fe13db1d358b9b0a4b193c8f784e5a52" gracePeriod=10 Oct 11 10:54:44.777285 master-0 kubenswrapper[4790]: I1011 10:54:44.777199 4790 generic.go:334] "Generic (PLEG): container finished" podID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerID="381e97f0118ecd0a55897d40739e39a7fe13db1d358b9b0a4b193c8f784e5a52" exitCode=0 Oct 11 10:54:44.777285 master-0 kubenswrapper[4790]: I1011 10:54:44.777260 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" event={"ID":"6a4dc537-c4a3-4538-887f-62fe3919d5f0","Type":"ContainerDied","Data":"381e97f0118ecd0a55897d40739e39a7fe13db1d358b9b0a4b193c8f784e5a52"} Oct 11 10:54:45.001691 master-0 kubenswrapper[4790]: I1011 10:54:45.001645 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" Oct 11 10:54:45.054915 master-0 kubenswrapper[4790]: I1011 10:54:45.053804 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-nb\") pod \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " Oct 11 10:54:45.054915 master-0 kubenswrapper[4790]: I1011 10:54:45.053915 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-config\") pod \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " Oct 11 10:54:45.054915 master-0 kubenswrapper[4790]: I1011 10:54:45.054086 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-dns-svc\") pod \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " Oct 11 10:54:45.054915 master-0 kubenswrapper[4790]: I1011 10:54:45.054110 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjqms\" (UniqueName: \"kubernetes.io/projected/6a4dc537-c4a3-4538-887f-62fe3919d5f0-kube-api-access-kjqms\") pod \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " Oct 11 10:54:45.054915 master-0 kubenswrapper[4790]: I1011 10:54:45.054191 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-sb\") pod \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\" (UID: \"6a4dc537-c4a3-4538-887f-62fe3919d5f0\") " Oct 11 10:54:45.067043 master-0 kubenswrapper[4790]: I1011 10:54:45.065068 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4dc537-c4a3-4538-887f-62fe3919d5f0-kube-api-access-kjqms" (OuterVolumeSpecName: "kube-api-access-kjqms") pod "6a4dc537-c4a3-4538-887f-62fe3919d5f0" (UID: "6a4dc537-c4a3-4538-887f-62fe3919d5f0"). InnerVolumeSpecName "kube-api-access-kjqms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:54:45.098810 master-0 kubenswrapper[4790]: I1011 10:54:45.098143 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a4dc537-c4a3-4538-887f-62fe3919d5f0" (UID: "6a4dc537-c4a3-4538-887f-62fe3919d5f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:54:45.102812 master-0 kubenswrapper[4790]: I1011 10:54:45.102664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-config" (OuterVolumeSpecName: "config") pod "6a4dc537-c4a3-4538-887f-62fe3919d5f0" (UID: "6a4dc537-c4a3-4538-887f-62fe3919d5f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:54:45.104162 master-0 kubenswrapper[4790]: I1011 10:54:45.104117 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a4dc537-c4a3-4538-887f-62fe3919d5f0" (UID: "6a4dc537-c4a3-4538-887f-62fe3919d5f0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:54:45.104693 master-0 kubenswrapper[4790]: I1011 10:54:45.104626 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a4dc537-c4a3-4538-887f-62fe3919d5f0" (UID: "6a4dc537-c4a3-4538-887f-62fe3919d5f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:54:45.156229 master-0 kubenswrapper[4790]: I1011 10:54:45.156139 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:45.156229 master-0 kubenswrapper[4790]: I1011 10:54:45.156178 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:45.156229 master-0 kubenswrapper[4790]: I1011 10:54:45.156188 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:45.156229 master-0 kubenswrapper[4790]: I1011 10:54:45.156197 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4dc537-c4a3-4538-887f-62fe3919d5f0-dns-svc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:45.156229 master-0 kubenswrapper[4790]: I1011 10:54:45.156206 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjqms\" (UniqueName: \"kubernetes.io/projected/6a4dc537-c4a3-4538-887f-62fe3919d5f0-kube-api-access-kjqms\") on node \"master-0\" DevicePath \"\"" Oct 11 10:54:45.293966 master-0 kubenswrapper[4790]: I1011 10:54:45.293515 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-848fcbb4df-dr4lc"] Oct 11 10:54:45.301739 master-0 kubenswrapper[4790]: W1011 10:54:45.301666 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc0250a9_8454_4716_8e79_36166266decb.slice/crio-393b77c3ac74fc56e55351c8f8d1ba6f73966040128fe5cd20fe4f7a6e594eb9 WatchSource:0}: Error finding container 393b77c3ac74fc56e55351c8f8d1ba6f73966040128fe5cd20fe4f7a6e594eb9: Status 404 returned error can't find the container with id 393b77c3ac74fc56e55351c8f8d1ba6f73966040128fe5cd20fe4f7a6e594eb9 Oct 11 10:54:45.789099 master-0 kubenswrapper[4790]: I1011 10:54:45.789025 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-848fcbb4df-dr4lc" event={"ID":"bc0250a9-8454-4716-8e79-36166266decb","Type":"ContainerStarted","Data":"393b77c3ac74fc56e55351c8f8d1ba6f73966040128fe5cd20fe4f7a6e594eb9"} Oct 11 10:54:45.792339 master-0 kubenswrapper[4790]: I1011 10:54:45.792282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt" event={"ID":"6a4dc537-c4a3-4538-887f-62fe3919d5f0","Type":"ContainerDied","Data":"ff20f3c33adc8038a2f30426436b1b65b746807b9fb9fe4c5e10d86eebcbb3ee"} Oct 11 10:54:45.792339 master-0 kubenswrapper[4790]: I1011 10:54:45.792311 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c99f4877f-dv8jt"
Oct 11 10:54:45.792439 master-0 kubenswrapper[4790]: I1011 10:54:45.792354 4790 scope.go:117] "RemoveContainer" containerID="381e97f0118ecd0a55897d40739e39a7fe13db1d358b9b0a4b193c8f784e5a52"
Oct 11 10:54:45.794938 master-0 kubenswrapper[4790]: I1011 10:54:45.794892 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"a0a5aa40-0146-4b81-83dd-761d514c557a","Type":"ContainerStarted","Data":"373e10b5f3797c10bf2f289215e447ba6ede87d322784f4e67d64c872023c089"}
Oct 11 10:54:45.796787 master-0 kubenswrapper[4790]: I1011 10:54:45.796629 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b597cbbf8-mh4z2" event={"ID":"7322b229-2c7a-4d99-a73b-f3612dc2670e","Type":"ContainerStarted","Data":"d4fb89f8224e853d11b04ea0aba6fb48ea4aa8a4301efbc8913741f90ccd165d"}
Oct 11 10:54:45.796787 master-0 kubenswrapper[4790]: I1011 10:54:45.796666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b597cbbf8-mh4z2" event={"ID":"7322b229-2c7a-4d99-a73b-f3612dc2670e","Type":"ContainerStarted","Data":"4975feadce043fc26c4f727068316dfb421e25490cb098227c3d99cafb766226"}
Oct 11 10:54:45.797736 master-0 kubenswrapper[4790]: I1011 10:54:45.797683 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b597cbbf8-mh4z2"
Oct 11 10:54:45.797795 master-0 kubenswrapper[4790]: I1011 10:54:45.797745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b597cbbf8-mh4z2"
Oct 11 10:54:45.804823 master-0 kubenswrapper[4790]: I1011 10:54:45.804787 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"06ebc6e3-ce04-4aac-bb04-ded9662f65e3","Type":"ContainerStarted","Data":"dc786133c9de5165782bb53f073d056d7ecf1bdd9443e8b6e694874591741d06"}
Oct 11 10:54:45.810562 master-0 kubenswrapper[4790]: I1011 10:54:45.810508 4790 scope.go:117] "RemoveContainer" containerID="42923cd7993a370d966eb589a3c5dfe41bcbc3a27770fa8b1538dbc31e8e9a97"
Oct 11 10:54:46.427283 master-0 kubenswrapper[4790]: I1011 10:54:46.427138 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b597cbbf8-mh4z2" podStartSLOduration=3.166377405 podStartE2EDuration="13.427114031s" podCreationTimestamp="2025-10-11 10:54:33 +0000 UTC" firstStartedPulling="2025-10-11 10:54:34.422409521 +0000 UTC m=+950.976869813" lastFinishedPulling="2025-10-11 10:54:44.683146147 +0000 UTC m=+961.237606439" observedRunningTime="2025-10-11 10:54:45.998631256 +0000 UTC m=+962.553091658" watchObservedRunningTime="2025-10-11 10:54:46.427114031 +0000 UTC m=+962.981574323"
Oct 11 10:54:46.431270 master-0 kubenswrapper[4790]: I1011 10:54:46.431214 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c99f4877f-dv8jt"]
Oct 11 10:54:46.443977 master-0 kubenswrapper[4790]: I1011 10:54:46.443910 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c99f4877f-dv8jt"]
Oct 11 10:54:46.817911 master-0 kubenswrapper[4790]: I1011 10:54:46.817841 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"a0a5aa40-0146-4b81-83dd-761d514c557a","Type":"ContainerStarted","Data":"c15b47f90fa6f6a74e4ba2cf37c1852384ef4ec1575e5ae98a981dfe2894d2d7"}
Oct 11 10:54:46.828327 master-0 kubenswrapper[4790]: I1011 10:54:46.828252 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b5802-default-external-api-1" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-log" containerID="cri-o://dc786133c9de5165782bb53f073d056d7ecf1bdd9443e8b6e694874591741d06" gracePeriod=30
Oct 11 10:54:46.828653 master-0 kubenswrapper[4790]: I1011 10:54:46.828623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"06ebc6e3-ce04-4aac-bb04-ded9662f65e3","Type":"ContainerStarted","Data":"1a8e071be20d3f943b3788a26f067c75ea4b4b613b2fd9732c1580727514e865"}
Oct 11 10:54:46.828777 master-0 kubenswrapper[4790]: I1011 10:54:46.828721 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b5802-default-external-api-1" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-httpd" containerID="cri-o://1a8e071be20d3f943b3788a26f067c75ea4b4b613b2fd9732c1580727514e865" gracePeriod=30
Oct 11 10:54:46.944737 master-0 kubenswrapper[4790]: I1011 10:54:46.944364 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b5802-default-internal-api-2" podStartSLOduration=10.150279284 podStartE2EDuration="21.944330916s" podCreationTimestamp="2025-10-11 10:54:25 +0000 UTC" firstStartedPulling="2025-10-11 10:54:32.94340623 +0000 UTC m=+949.497866532" lastFinishedPulling="2025-10-11 10:54:44.737457872 +0000 UTC m=+961.291918164" observedRunningTime="2025-10-11 10:54:46.935623214 +0000 UTC m=+963.490083526" watchObservedRunningTime="2025-10-11 10:54:46.944330916 +0000 UTC m=+963.498791228"
Oct 11 10:54:47.087796 master-0 kubenswrapper[4790]: I1011 10:54:47.087661 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b5802-default-external-api-1" podStartSLOduration=11.717082542 podStartE2EDuration="25.087632917s" podCreationTimestamp="2025-10-11 10:54:22 +0000 UTC" firstStartedPulling="2025-10-11 10:54:31.406971668 +0000 UTC m=+947.961431960" lastFinishedPulling="2025-10-11 10:54:44.777522043 +0000 UTC m=+961.331982335" observedRunningTime="2025-10-11 10:54:47.076698915 +0000 UTC m=+963.631159207" watchObservedRunningTime="2025-10-11 10:54:47.087632917 +0000 UTC m=+963.642093229"
Oct 11 10:54:47.836115 master-0 kubenswrapper[4790]: I1011 10:54:47.836047 4790 generic.go:334] "Generic (PLEG): container finished" podID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerID="1a8e071be20d3f943b3788a26f067c75ea4b4b613b2fd9732c1580727514e865" exitCode=0
Oct 11 10:54:47.836115 master-0 kubenswrapper[4790]: I1011 10:54:47.836090 4790 generic.go:334] "Generic (PLEG): container finished" podID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerID="dc786133c9de5165782bb53f073d056d7ecf1bdd9443e8b6e694874591741d06" exitCode=143
Oct 11 10:54:47.837005 master-0 kubenswrapper[4790]: I1011 10:54:47.836966 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"06ebc6e3-ce04-4aac-bb04-ded9662f65e3","Type":"ContainerDied","Data":"1a8e071be20d3f943b3788a26f067c75ea4b4b613b2fd9732c1580727514e865"}
Oct 11 10:54:47.837132 master-0 kubenswrapper[4790]: I1011 10:54:47.837118 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"06ebc6e3-ce04-4aac-bb04-ded9662f65e3","Type":"ContainerDied","Data":"dc786133c9de5165782bb53f073d056d7ecf1bdd9443e8b6e694874591741d06"}
Oct 11 10:54:48.303868 master-0 kubenswrapper[4790]: I1011 10:54:48.302374 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" path="/var/lib/kubelet/pods/6a4dc537-c4a3-4538-887f-62fe3919d5f0/volumes"
Oct 11 10:54:50.064872 master-0 kubenswrapper[4790]: I1011 10:54:50.064785 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:50.186293 master-0 kubenswrapper[4790]: I1011 10:54:50.186202 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-scripts\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.186784 master-0 kubenswrapper[4790]: I1011 10:54:50.186331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-config-data\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.186784 master-0 kubenswrapper[4790]: I1011 10:54:50.186385 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-combined-ca-bundle\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.186784 master-0 kubenswrapper[4790]: I1011 10:54:50.186428 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-public-tls-certs\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.186784 master-0 kubenswrapper[4790]: I1011 10:54:50.186521 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-logs\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.186784 master-0 kubenswrapper[4790]: I1011 10:54:50.186545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvf6\" (UniqueName: \"kubernetes.io/projected/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-kube-api-access-2wvf6\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.186784 master-0 kubenswrapper[4790]: I1011 10:54:50.186586 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-httpd-run\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.186784 master-0 kubenswrapper[4790]: I1011 10:54:50.186782 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\" (UID: \"06ebc6e3-ce04-4aac-bb04-ded9662f65e3\") "
Oct 11 10:54:50.187532 master-0 kubenswrapper[4790]: I1011 10:54:50.187442 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-logs" (OuterVolumeSpecName: "logs") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 10:54:50.187652 master-0 kubenswrapper[4790]: I1011 10:54:50.187496 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 10:54:50.191018 master-0 kubenswrapper[4790]: I1011 10:54:50.190945 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-kube-api-access-2wvf6" (OuterVolumeSpecName: "kube-api-access-2wvf6") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "kube-api-access-2wvf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:54:50.193456 master-0 kubenswrapper[4790]: I1011 10:54:50.193404 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-scripts" (OuterVolumeSpecName: "scripts") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:54:50.211596 master-0 kubenswrapper[4790]: I1011 10:54:50.211523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7" (OuterVolumeSpecName: "glance") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "pvc-c7212717-18be-4287-9071-f6f818672815". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 11 10:54:50.214966 master-0 kubenswrapper[4790]: I1011 10:54:50.214932 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:54:50.224671 master-0 kubenswrapper[4790]: I1011 10:54:50.224565 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-config-data" (OuterVolumeSpecName: "config-data") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:54:50.241784 master-0 kubenswrapper[4790]: I1011 10:54:50.241683 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "06ebc6e3-ce04-4aac-bb04-ded9662f65e3" (UID: "06ebc6e3-ce04-4aac-bb04-ded9662f65e3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:54:50.288268 master-0 kubenswrapper[4790]: I1011 10:54:50.288191 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-config-data\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.288268 master-0 kubenswrapper[4790]: I1011 10:54:50.288238 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.288268 master-0 kubenswrapper[4790]: I1011 10:54:50.288249 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.288268 master-0 kubenswrapper[4790]: I1011 10:54:50.288258 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-logs\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.288268 master-0 kubenswrapper[4790]: I1011 10:54:50.288268 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvf6\" (UniqueName: \"kubernetes.io/projected/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-kube-api-access-2wvf6\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.288686 master-0 kubenswrapper[4790]: I1011 10:54:50.288303 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-httpd-run\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.288686 master-0 kubenswrapper[4790]: I1011 10:54:50.288344 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") on node \"master-0\" "
Oct 11 10:54:50.288686 master-0 kubenswrapper[4790]: I1011 10:54:50.288354 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ebc6e3-ce04-4aac-bb04-ded9662f65e3-scripts\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.304208 master-0 kubenswrapper[4790]: I1011 10:54:50.304156 4790 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 11 10:54:50.304427 master-0 kubenswrapper[4790]: I1011 10:54:50.304399 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c7212717-18be-4287-9071-f6f818672815" (UniqueName: "kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7") on node "master-0"
Oct 11 10:54:50.389971 master-0 kubenswrapper[4790]: I1011 10:54:50.389879 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") on node \"master-0\" DevicePath \"\""
Oct 11 10:54:50.868000 master-0 kubenswrapper[4790]: I1011 10:54:50.867909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"06ebc6e3-ce04-4aac-bb04-ded9662f65e3","Type":"ContainerDied","Data":"ba85d8283f47fbc71a069e2badeb37c9ec3baa560e78e0e496d6945e23ef982f"}
Oct 11 10:54:50.868000 master-0 kubenswrapper[4790]: I1011 10:54:50.868000 4790 scope.go:117] "RemoveContainer" containerID="1a8e071be20d3f943b3788a26f067c75ea4b4b613b2fd9732c1580727514e865"
Oct 11 10:54:50.868267 master-0 kubenswrapper[4790]: I1011 10:54:50.868071 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:50.888773 master-0 kubenswrapper[4790]: I1011 10:54:50.888680 4790 scope.go:117] "RemoveContainer" containerID="dc786133c9de5165782bb53f073d056d7ecf1bdd9443e8b6e694874591741d06"
Oct 11 10:54:50.951606 master-0 kubenswrapper[4790]: I1011 10:54:50.951517 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5802-default-external-api-1"]
Oct 11 10:54:51.007904 master-0 kubenswrapper[4790]: I1011 10:54:51.007693 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b5802-default-external-api-1"]
Oct 11 10:54:51.092151 master-0 kubenswrapper[4790]: I1011 10:54:51.092079 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b5802-default-external-api-1"]
Oct 11 10:54:51.092944 master-0 kubenswrapper[4790]: E1011 10:54:51.092923 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerName="init"
Oct 11 10:54:51.092993 master-0 kubenswrapper[4790]: I1011 10:54:51.092946 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerName="init"
Oct 11 10:54:51.092993 master-0 kubenswrapper[4790]: E1011 10:54:51.092960 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerName="dnsmasq-dns"
Oct 11 10:54:51.092993 master-0 kubenswrapper[4790]: I1011 10:54:51.092970 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerName="dnsmasq-dns"
Oct 11 10:54:51.093064 master-0 kubenswrapper[4790]: E1011 10:54:51.093007 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-log"
Oct 11 10:54:51.093064 master-0 kubenswrapper[4790]: I1011 10:54:51.093018 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-log"
Oct 11 10:54:51.093064 master-0 kubenswrapper[4790]: E1011 10:54:51.093048 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-httpd"
Oct 11 10:54:51.093064 master-0 kubenswrapper[4790]: I1011 10:54:51.093057 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-httpd"
Oct 11 10:54:51.093623 master-0 kubenswrapper[4790]: I1011 10:54:51.093567 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-httpd"
Oct 11 10:54:51.093668 master-0 kubenswrapper[4790]: I1011 10:54:51.093626 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" containerName="glance-log"
Oct 11 10:54:51.093877 master-0 kubenswrapper[4790]: I1011 10:54:51.093842 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4dc537-c4a3-4538-887f-62fe3919d5f0" containerName="dnsmasq-dns"
Oct 11 10:54:51.097695 master-0 kubenswrapper[4790]: I1011 10:54:51.097666 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.102141 master-0 kubenswrapper[4790]: I1011 10:54:51.102090 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 11 10:54:51.103032 master-0 kubenswrapper[4790]: I1011 10:54:51.102919 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b5802-default-external-config-data"
Oct 11 10:54:51.139547 master-0 kubenswrapper[4790]: I1011 10:54:51.139470 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-external-api-1"]
Oct 11 10:54:51.711915 master-0 kubenswrapper[4790]: I1011 10:54:51.711812 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.711915 master-0 kubenswrapper[4790]: I1011 10:54:51.711910 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.712436 master-0 kubenswrapper[4790]: I1011 10:54:51.711940 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.712436 master-0 kubenswrapper[4790]: I1011 10:54:51.712101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.712436 master-0 kubenswrapper[4790]: I1011 10:54:51.712258 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.712436 master-0 kubenswrapper[4790]: I1011 10:54:51.712369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbtf\" (UniqueName: \"kubernetes.io/projected/3166d4fc-8488-46dc-9d63-87dc403f66bc-kube-api-access-vpbtf\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.712436 master-0 kubenswrapper[4790]: I1011 10:54:51.712394 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.712728 master-0 kubenswrapper[4790]: I1011 10:54:51.712501 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.815815 master-0 kubenswrapper[4790]: I1011 10:54:51.815512 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.815815 master-0 kubenswrapper[4790]: I1011 10:54:51.815644 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.815815 master-0 kubenswrapper[4790]: I1011 10:54:51.815700 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.815815 master-0 kubenswrapper[4790]: I1011 10:54:51.815816 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.816264 master-0 kubenswrapper[4790]: I1011 10:54:51.815873 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.816264 master-0 kubenswrapper[4790]: I1011 10:54:51.815937 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpbtf\" (UniqueName: \"kubernetes.io/projected/3166d4fc-8488-46dc-9d63-87dc403f66bc-kube-api-access-vpbtf\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.816264 master-0 kubenswrapper[4790]: I1011 10:54:51.815970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.816264 master-0 kubenswrapper[4790]: I1011 10:54:51.816022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.816647 master-0 kubenswrapper[4790]: I1011 10:54:51.816592 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.816826 master-0 kubenswrapper[4790]: I1011 10:54:51.816768 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.818346 master-0 kubenswrapper[4790]: I1011 10:54:51.818312 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 11 10:54:51.818417 master-0 kubenswrapper[4790]: I1011 10:54:51.818355 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/95ab3ea1c73b905e55aa0f0a1e574a5056ec96dde23978388ab58fbe89465472/globalmount\"" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.820865 master-0 kubenswrapper[4790]: I1011 10:54:51.820836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.822660 master-0 kubenswrapper[4790]: I1011 10:54:51.822378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.822660 master-0 kubenswrapper[4790]: I1011 10:54:51.822622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.826630 master-0 kubenswrapper[4790]: I1011 10:54:51.826587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.881221 master-0 kubenswrapper[4790]: I1011 10:54:51.881118 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-848fcbb4df-dr4lc" event={"ID":"bc0250a9-8454-4716-8e79-36166266decb","Type":"ContainerStarted","Data":"c0ce4103aad4e111bf4e5c18f994dcb6ae345b0b57188ee1b0fbafd512d4a6ee"}
Oct 11 10:54:51.881508 master-0 kubenswrapper[4790]: I1011 10:54:51.881332 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-848fcbb4df-dr4lc"
Oct 11 10:54:51.890736 master-0 kubenswrapper[4790]: I1011 10:54:51.888512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpbtf\" (UniqueName: \"kubernetes.io/projected/3166d4fc-8488-46dc-9d63-87dc403f66bc-kube-api-access-vpbtf\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:51.923599 master-0 kubenswrapper[4790]: I1011 10:54:51.923176 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-848fcbb4df-dr4lc" podStartSLOduration=10.492357931 podStartE2EDuration="15.923150355s" podCreationTimestamp="2025-10-11 10:54:36 +0000 UTC" firstStartedPulling="2025-10-11 10:54:45.304455717 +0000 UTC m=+961.858916009" lastFinishedPulling="2025-10-11 10:54:50.735248131 +0000 UTC m=+967.289708433" observedRunningTime="2025-10-11 10:54:51.91396203 +0000 UTC m=+968.468422332" watchObservedRunningTime="2025-10-11 10:54:51.923150355 +0000 UTC m=+968.477610657"
Oct 11 10:54:52.303611 master-0 kubenswrapper[4790]: I1011 10:54:52.303498 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ebc6e3-ce04-4aac-bb04-ded9662f65e3" path="/var/lib/kubelet/pods/06ebc6e3-ce04-4aac-bb04-ded9662f65e3/volumes"
Oct 11 10:54:52.334956 master-0 kubenswrapper[4790]: I1011 10:54:52.334865 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:52.335204 master-0 kubenswrapper[4790]: I1011 10:54:52.334975 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:52.371878 master-0 kubenswrapper[4790]: I1011 10:54:52.371816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:52.391453 master-0 kubenswrapper[4790]: I1011 10:54:52.391363 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:52.895419 master-0 kubenswrapper[4790]: I1011 10:54:52.895325 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:52.895419 master-0 kubenswrapper[4790]: I1011 10:54:52.895413 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:53.299791 master-0 kubenswrapper[4790]: I1011 10:54:53.295595 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:53.517698 master-0 kubenswrapper[4790]: I1011 10:54:53.517583 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:54:54.063415 master-0 kubenswrapper[4790]: I1011 10:54:54.063334 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-external-api-1"]
Oct 11 10:54:54.926637 master-0 kubenswrapper[4790]: I1011 10:54:54.926548 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"3166d4fc-8488-46dc-9d63-87dc403f66bc","Type":"ContainerStarted","Data":"1980b844ffaf3f4ae914e07f575c0c012ae970754f7fcb5bc4e59940912228a6"}
Oct 11 10:54:54.926637 master-0 kubenswrapper[4790]: I1011 10:54:54.926612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"3166d4fc-8488-46dc-9d63-87dc403f66bc","Type":"ContainerStarted","Data":"f242a619a098bee9251349acb03ad40745b6b14dcdda08d9b62f04ce2b3b042e"}
Oct 11 10:54:55.005844 master-0 kubenswrapper[4790]: I1011 10:54:55.005282 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:55.006087 master-0 kubenswrapper[4790]: I1011 10:54:55.005952 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 11 10:54:55.083968 master-0 kubenswrapper[4790]: I1011 10:54:55.083914 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:54:55.943620 master-0 kubenswrapper[4790]: I1011 10:54:55.943549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"3166d4fc-8488-46dc-9d63-87dc403f66bc","Type":"ContainerStarted","Data":"b351fbb5c1ba1d71d14296c2d406aca09cc6016d89d1300e0ba6b0d00727939f"}
Oct 11 10:55:03.519003 master-0 kubenswrapper[4790]: I1011 10:55:03.518865 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:55:03.520222 master-0 kubenswrapper[4790]: I1011 10:55:03.520106 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:55:03.558256 master-0 kubenswrapper[4790]: I1011 10:55:03.558170 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:55:03.573574 master-0 kubenswrapper[4790]: I1011 10:55:03.573503 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:55:03.605784 master-0 kubenswrapper[4790]: I1011 10:55:03.604115 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b5802-default-external-api-1" podStartSLOduration=12.604066403000001 podStartE2EDuration="12.604066403s" podCreationTimestamp="2025-10-11 10:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:54:55.987808926 +0000 UTC m=+972.542269228" watchObservedRunningTime="2025-10-11 10:55:03.604066403 +0000 UTC m=+980.158526705"
Oct 11 10:55:04.015241 master-0 kubenswrapper[4790]: I1011 10:55:04.015156 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:55:04.015241 master-0 kubenswrapper[4790]: I1011 10:55:04.015245 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:55:04.265526 master-0 kubenswrapper[4790]:
I1011 10:55:04.265370 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"] Oct 11 10:55:04.267051 master-0 kubenswrapper[4790]: I1011 10:55:04.267019 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.274053 master-0 kubenswrapper[4790]: I1011 10:55:04.274001 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b5802-scripts" Oct 11 10:55:04.274293 master-0 kubenswrapper[4790]: I1011 10:55:04.274260 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b5802-config-data" Oct 11 10:55:04.274487 master-0 kubenswrapper[4790]: I1011 10:55:04.274464 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b5802-volume-lvm-iscsi-config-data" Oct 11 10:55:04.363276 master-0 kubenswrapper[4790]: I1011 10:55:04.363207 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"] Oct 11 10:55:04.363276 master-0 kubenswrapper[4790]: I1011 10:55:04.363270 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c75cc6dff-n4dg7"] Oct 11 10:55:04.368619 master-0 kubenswrapper[4790]: I1011 10:55:04.368559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368799 master-0 kubenswrapper[4790]: I1011 10:55:04.368681 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-dev\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: 
\"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368799 master-0 kubenswrapper[4790]: I1011 10:55:04.368750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-sys\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368799 master-0 kubenswrapper[4790]: I1011 10:55:04.368775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data-custom\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368952 master-0 kubenswrapper[4790]: I1011 10:55:04.368804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-nvme\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368952 master-0 kubenswrapper[4790]: I1011 10:55:04.368829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k54k8\" (UniqueName: \"kubernetes.io/projected/60d68c10-8e1c-4a92-86f6-e2925df0f714-kube-api-access-k54k8\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368952 master-0 kubenswrapper[4790]: I1011 10:55:04.368860 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-combined-ca-bundle\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368952 master-0 kubenswrapper[4790]: I1011 10:55:04.368898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-brick\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.368952 master-0 kubenswrapper[4790]: I1011 10:55:04.368931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-run\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.369151 master-0 kubenswrapper[4790]: I1011 10:55:04.368962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-machine-id\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.369151 master-0 kubenswrapper[4790]: I1011 10:55:04.368992 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-scripts\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.369151 master-0 kubenswrapper[4790]: I1011 10:55:04.369034 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-lib-modules\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.369151 master-0 kubenswrapper[4790]: I1011 10:55:04.369120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.369888 master-0 kubenswrapper[4790]: I1011 10:55:04.369167 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-lib-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.369888 master-0 kubenswrapper[4790]: I1011 10:55:04.369196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-iscsi\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.369888 master-0 kubenswrapper[4790]: I1011 10:55:04.369631 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.373837 master-0 kubenswrapper[4790]: I1011 10:55:04.372590 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c75cc6dff-n4dg7"] Oct 11 10:55:04.377102 master-0 kubenswrapper[4790]: I1011 10:55:04.374208 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 10:55:04.377528 master-0 kubenswrapper[4790]: I1011 10:55:04.377456 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 10:55:04.377528 master-0 kubenswrapper[4790]: I1011 10:55:04.377505 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 11 10:55:04.377722 master-0 kubenswrapper[4790]: I1011 10:55:04.377671 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 10:55:04.378038 master-0 kubenswrapper[4790]: I1011 10:55:04.378000 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 11 10:55:04.470735 master-0 kubenswrapper[4790]: I1011 10:55:04.470680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data-custom\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471181 master-0 kubenswrapper[4790]: I1011 10:55:04.471163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-sys\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471280 master-0 kubenswrapper[4790]: I1011 10:55:04.471268 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-nvme\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471362 master-0 kubenswrapper[4790]: I1011 10:55:04.471350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k54k8\" (UniqueName: \"kubernetes.io/projected/60d68c10-8e1c-4a92-86f6-e2925df0f714-kube-api-access-k54k8\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471436 master-0 kubenswrapper[4790]: I1011 10:55:04.471421 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-sb\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.471516 master-0 kubenswrapper[4790]: I1011 10:55:04.471503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-combined-ca-bundle\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471588 master-0 kubenswrapper[4790]: I1011 10:55:04.471575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-brick\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471669 
master-0 kubenswrapper[4790]: I1011 10:55:04.471657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-run\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471770 master-0 kubenswrapper[4790]: I1011 10:55:04.471758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-machine-id\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471845 master-0 kubenswrapper[4790]: I1011 10:55:04.471834 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ljhl\" (UniqueName: \"kubernetes.io/projected/267b88dd-f511-44be-83eb-15e57143e363-kube-api-access-6ljhl\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.471917 master-0 kubenswrapper[4790]: I1011 10:55:04.471904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-scripts\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.471992 master-0 kubenswrapper[4790]: I1011 10:55:04.471979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-swift-storage-0\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " 
pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.472079 master-0 kubenswrapper[4790]: I1011 10:55:04.472068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-lib-modules\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.472173 master-0 kubenswrapper[4790]: I1011 10:55:04.472151 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-nb\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.472448 master-0 kubenswrapper[4790]: I1011 10:55:04.472434 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.472529 master-0 kubenswrapper[4790]: I1011 10:55:04.472519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-lib-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.472608 master-0 kubenswrapper[4790]: I1011 10:55:04.472593 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-iscsi\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: 
\"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.472700 master-0 kubenswrapper[4790]: I1011 10:55:04.472689 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-svc\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.472797 master-0 kubenswrapper[4790]: I1011 10:55:04.472785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.472878 master-0 kubenswrapper[4790]: I1011 10:55:04.472866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-config\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.472956 master-0 kubenswrapper[4790]: I1011 10:55:04.472944 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-dev\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.473083 master-0 kubenswrapper[4790]: I1011 10:55:04.473070 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-dev\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: 
\"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.473161 master-0 kubenswrapper[4790]: I1011 10:55:04.471211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-sys\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.473290 master-0 kubenswrapper[4790]: I1011 10:55:04.473276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-nvme\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.474473 master-0 kubenswrapper[4790]: I1011 10:55:04.474431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-lib-modules\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.474735 master-0 kubenswrapper[4790]: I1011 10:55:04.474691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data-custom\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.474786 master-0 kubenswrapper[4790]: I1011 10:55:04.474758 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-brick\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: 
\"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.474824 master-0 kubenswrapper[4790]: I1011 10:55:04.474780 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-iscsi\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.474824 master-0 kubenswrapper[4790]: I1011 10:55:04.474797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-run\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.475313 master-0 kubenswrapper[4790]: I1011 10:55:04.474827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-machine-id\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.475313 master-0 kubenswrapper[4790]: I1011 10:55:04.474896 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.475313 master-0 kubenswrapper[4790]: I1011 10:55:04.475001 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-lib-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: 
\"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.478086 master-0 kubenswrapper[4790]: I1011 10:55:04.478065 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-combined-ca-bundle\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.479075 master-0 kubenswrapper[4790]: I1011 10:55:04.479020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-scripts\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.487734 master-0 kubenswrapper[4790]: I1011 10:55:04.479768 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.522837 master-0 kubenswrapper[4790]: I1011 10:55:04.520201 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7887b79bcd-vk5xz"] Oct 11 10:55:04.522837 master-0 kubenswrapper[4790]: I1011 10:55:04.522190 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.532560 master-0 kubenswrapper[4790]: I1011 10:55:04.532060 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 11 10:55:04.532560 master-0 kubenswrapper[4790]: I1011 10:55:04.532346 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 11 10:55:04.532560 master-0 kubenswrapper[4790]: I1011 10:55:04.532498 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 11 10:55:04.541073 master-0 kubenswrapper[4790]: I1011 10:55:04.541023 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k54k8\" (UniqueName: \"kubernetes.io/projected/60d68c10-8e1c-4a92-86f6-e2925df0f714-kube-api-access-k54k8\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.544813 master-0 kubenswrapper[4790]: I1011 10:55:04.544745 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7887b79bcd-vk5xz"] Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.574897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-httpd-config\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.574958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-combined-ca-bundle\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" 
Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575004 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-config\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575049 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ljhl\" (UniqueName: \"kubernetes.io/projected/267b88dd-f511-44be-83eb-15e57143e363-kube-api-access-6ljhl\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575079 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96jc\" (UniqueName: \"kubernetes.io/projected/7739fd2d-10b5-425d-acbf-f50630f07017-kube-api-access-f96jc\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575101 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-nb\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-ovndb-tls-certs\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " 
pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-svc\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575181 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-config\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575201 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-sb\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.575237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-swift-storage-0\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.577627 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-swift-storage-0\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " 
pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.579322 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-nb\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.579542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-svc\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.579578 master-0 kubenswrapper[4790]: I1011 10:55:04.579544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-sb\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.580507 master-0 kubenswrapper[4790]: I1011 10:55:04.580387 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-config\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.601739 master-0 kubenswrapper[4790]: I1011 10:55:04.598152 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:04.601739 master-0 kubenswrapper[4790]: I1011 10:55:04.600149 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c75cc6dff-n4dg7"] Oct 11 10:55:04.601739 master-0 kubenswrapper[4790]: E1011 10:55:04.600723 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-6ljhl], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" podUID="267b88dd-f511-44be-83eb-15e57143e363" Oct 11 10:55:04.618739 master-0 kubenswrapper[4790]: I1011 10:55:04.617757 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ljhl\" (UniqueName: \"kubernetes.io/projected/267b88dd-f511-44be-83eb-15e57143e363-kube-api-access-6ljhl\") pod \"dnsmasq-dns-7c75cc6dff-n4dg7\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:04.677864 master-0 kubenswrapper[4790]: I1011 10:55:04.676447 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-httpd-config\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.677864 master-0 kubenswrapper[4790]: I1011 10:55:04.676559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-combined-ca-bundle\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.677864 master-0 kubenswrapper[4790]: I1011 10:55:04.676676 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f96jc\" (UniqueName: 
\"kubernetes.io/projected/7739fd2d-10b5-425d-acbf-f50630f07017-kube-api-access-f96jc\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.677864 master-0 kubenswrapper[4790]: I1011 10:55:04.676700 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-ovndb-tls-certs\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.677864 master-0 kubenswrapper[4790]: I1011 10:55:04.676765 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-config\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.681768 master-0 kubenswrapper[4790]: I1011 10:55:04.680794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-httpd-config\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.681768 master-0 kubenswrapper[4790]: I1011 10:55:04.681699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-config\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.681768 master-0 kubenswrapper[4790]: I1011 10:55:04.681763 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-combined-ca-bundle\") pod 
\"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.685744 master-0 kubenswrapper[4790]: I1011 10:55:04.685033 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-ovndb-tls-certs\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.720763 master-0 kubenswrapper[4790]: I1011 10:55:04.719500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96jc\" (UniqueName: \"kubernetes.io/projected/7739fd2d-10b5-425d-acbf-f50630f07017-kube-api-access-f96jc\") pod \"neutron-7887b79bcd-vk5xz\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") " pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:04.770488 master-0 kubenswrapper[4790]: I1011 10:55:04.768903 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:04.770488 master-0 kubenswrapper[4790]: I1011 10:55:04.770255 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-api-1" Oct 11 10:55:04.774754 master-0 kubenswrapper[4790]: I1011 10:55:04.774205 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b5802-api-config-data" Oct 11 10:55:04.816800 master-0 kubenswrapper[4790]: I1011 10:55:04.805646 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:04.904755 master-0 kubenswrapper[4790]: I1011 10:55:04.904506 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c351cb-246a-4343-a56d-c74fb4be119e-logs\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:04.905088 master-0 kubenswrapper[4790]: I1011 10:55:04.904830 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-scripts\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:04.905088 master-0 kubenswrapper[4790]: I1011 10:55:04.904951 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30c351cb-246a-4343-a56d-c74fb4be119e-etc-machine-id\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:04.905088 master-0 kubenswrapper[4790]: I1011 10:55:04.904999 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data-custom\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 
10:55:04.905088 master-0 kubenswrapper[4790]: I1011 10:55:04.905032 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xghz\" (UniqueName: \"kubernetes.io/projected/30c351cb-246a-4343-a56d-c74fb4be119e-kube-api-access-5xghz\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:04.905282 master-0 kubenswrapper[4790]: I1011 10:55:04.905112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-combined-ca-bundle\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:04.905282 master-0 kubenswrapper[4790]: I1011 10:55:04.905163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:04.905785 master-0 kubenswrapper[4790]: I1011 10:55:04.905730 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:05.008299 master-0 kubenswrapper[4790]: I1011 10:55:05.007444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data-custom\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.008299 master-0 kubenswrapper[4790]: I1011 10:55:05.007521 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xghz\" (UniqueName: \"kubernetes.io/projected/30c351cb-246a-4343-a56d-c74fb4be119e-kube-api-access-5xghz\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.008299 master-0 kubenswrapper[4790]: I1011 10:55:05.007571 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-combined-ca-bundle\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.008299 master-0 kubenswrapper[4790]: I1011 10:55:05.007624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.008299 master-0 kubenswrapper[4790]: I1011 10:55:05.007655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c351cb-246a-4343-a56d-c74fb4be119e-logs\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.008299 master-0 
kubenswrapper[4790]: I1011 10:55:05.007769 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-scripts\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.008299 master-0 kubenswrapper[4790]: I1011 10:55:05.007838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30c351cb-246a-4343-a56d-c74fb4be119e-etc-machine-id\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.008299 master-0 kubenswrapper[4790]: I1011 10:55:05.007972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30c351cb-246a-4343-a56d-c74fb4be119e-etc-machine-id\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.010235 master-0 kubenswrapper[4790]: I1011 10:55:05.010184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c351cb-246a-4343-a56d-c74fb4be119e-logs\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.015572 master-0 kubenswrapper[4790]: I1011 10:55:05.015528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-combined-ca-bundle\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.018575 master-0 kubenswrapper[4790]: I1011 10:55:05.018548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.019530 master-0 kubenswrapper[4790]: I1011 10:55:05.019469 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data-custom\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.034953 master-0 kubenswrapper[4790]: I1011 10:55:05.025269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-scripts\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.055035 master-0 kubenswrapper[4790]: I1011 10:55:05.052892 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:05.058353 master-0 kubenswrapper[4790]: I1011 10:55:05.058032 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xghz\" (UniqueName: \"kubernetes.io/projected/30c351cb-246a-4343-a56d-c74fb4be119e-kube-api-access-5xghz\") pod \"cinder-b5802-api-1\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.075378 master-0 kubenswrapper[4790]: I1011 10:55:05.075345 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:05.095869 master-0 kubenswrapper[4790]: I1011 10:55:05.095774 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-api-1" Oct 11 10:55:05.109243 master-0 kubenswrapper[4790]: I1011 10:55:05.109180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-config\") pod \"267b88dd-f511-44be-83eb-15e57143e363\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " Oct 11 10:55:05.109513 master-0 kubenswrapper[4790]: I1011 10:55:05.109281 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-nb\") pod \"267b88dd-f511-44be-83eb-15e57143e363\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " Oct 11 10:55:05.109513 master-0 kubenswrapper[4790]: I1011 10:55:05.109409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-sb\") pod \"267b88dd-f511-44be-83eb-15e57143e363\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " Oct 11 10:55:05.109513 master-0 kubenswrapper[4790]: I1011 10:55:05.109441 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-svc\") pod \"267b88dd-f511-44be-83eb-15e57143e363\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " Oct 11 10:55:05.109513 master-0 kubenswrapper[4790]: I1011 10:55:05.109481 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-swift-storage-0\") pod \"267b88dd-f511-44be-83eb-15e57143e363\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " Oct 11 10:55:05.109673 master-0 kubenswrapper[4790]: I1011 10:55:05.109569 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6ljhl\" (UniqueName: \"kubernetes.io/projected/267b88dd-f511-44be-83eb-15e57143e363-kube-api-access-6ljhl\") pod \"267b88dd-f511-44be-83eb-15e57143e363\" (UID: \"267b88dd-f511-44be-83eb-15e57143e363\") " Oct 11 10:55:05.109842 master-0 kubenswrapper[4790]: I1011 10:55:05.109763 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-config" (OuterVolumeSpecName: "config") pod "267b88dd-f511-44be-83eb-15e57143e363" (UID: "267b88dd-f511-44be-83eb-15e57143e363"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:05.110290 master-0 kubenswrapper[4790]: I1011 10:55:05.110260 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "267b88dd-f511-44be-83eb-15e57143e363" (UID: "267b88dd-f511-44be-83eb-15e57143e363"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:05.111138 master-0 kubenswrapper[4790]: I1011 10:55:05.110747 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "267b88dd-f511-44be-83eb-15e57143e363" (UID: "267b88dd-f511-44be-83eb-15e57143e363"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:05.111138 master-0 kubenswrapper[4790]: I1011 10:55:05.110817 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "267b88dd-f511-44be-83eb-15e57143e363" (UID: "267b88dd-f511-44be-83eb-15e57143e363"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:05.111502 master-0 kubenswrapper[4790]: I1011 10:55:05.111428 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "267b88dd-f511-44be-83eb-15e57143e363" (UID: "267b88dd-f511-44be-83eb-15e57143e363"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:05.113327 master-0 kubenswrapper[4790]: I1011 10:55:05.113299 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:05.113465 master-0 kubenswrapper[4790]: I1011 10:55:05.113450 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:05.113577 master-0 kubenswrapper[4790]: I1011 10:55:05.113562 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-svc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:05.113696 master-0 kubenswrapper[4790]: I1011 10:55:05.113677 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:05.113832 master-0 kubenswrapper[4790]: I1011 10:55:05.113817 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267b88dd-f511-44be-83eb-15e57143e363-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:05.116041 master-0 
kubenswrapper[4790]: I1011 10:55:05.115969 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267b88dd-f511-44be-83eb-15e57143e363-kube-api-access-6ljhl" (OuterVolumeSpecName: "kube-api-access-6ljhl") pod "267b88dd-f511-44be-83eb-15e57143e363" (UID: "267b88dd-f511-44be-83eb-15e57143e363"). InnerVolumeSpecName "kube-api-access-6ljhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:05.161632 master-0 kubenswrapper[4790]: I1011 10:55:05.161568 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"] Oct 11 10:55:05.181123 master-0 kubenswrapper[4790]: W1011 10:55:05.181006 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60d68c10_8e1c_4a92_86f6_e2925df0f714.slice/crio-3242bfe4a3531200a62ec255f320d35782117671872a6959f988bac26f9f7d3f WatchSource:0}: Error finding container 3242bfe4a3531200a62ec255f320d35782117671872a6959f988bac26f9f7d3f: Status 404 returned error can't find the container with id 3242bfe4a3531200a62ec255f320d35782117671872a6959f988bac26f9f7d3f Oct 11 10:55:05.217312 master-0 kubenswrapper[4790]: I1011 10:55:05.217257 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ljhl\" (UniqueName: \"kubernetes.io/projected/267b88dd-f511-44be-83eb-15e57143e363-kube-api-access-6ljhl\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:05.255491 master-0 kubenswrapper[4790]: I1011 10:55:05.255047 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:55:05.255491 master-0 kubenswrapper[4790]: I1011 10:55:05.255128 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b597cbbf8-mh4z2" Oct 11 10:55:05.553061 master-0 kubenswrapper[4790]: I1011 10:55:05.552576 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-7887b79bcd-vk5xz"] Oct 11 10:55:05.558723 master-0 kubenswrapper[4790]: I1011 10:55:05.557845 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:05.572061 master-0 kubenswrapper[4790]: W1011 10:55:05.571291 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7739fd2d_10b5_425d_acbf_f50630f07017.slice/crio-bff1b3163e12e8cb39a3fbc83d1e97e1c30281603df9047eeb4b5e8ea878efe5 WatchSource:0}: Error finding container bff1b3163e12e8cb39a3fbc83d1e97e1c30281603df9047eeb4b5e8ea878efe5: Status 404 returned error can't find the container with id bff1b3163e12e8cb39a3fbc83d1e97e1c30281603df9047eeb4b5e8ea878efe5 Oct 11 10:55:06.075779 master-0 kubenswrapper[4790]: I1011 10:55:06.075607 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"30c351cb-246a-4343-a56d-c74fb4be119e","Type":"ContainerStarted","Data":"72c3b882472f96be141ddefa786998e8b0390cd596d77062abb0fcaa4a2d580f"} Oct 11 10:55:06.077964 master-0 kubenswrapper[4790]: I1011 10:55:06.077915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7887b79bcd-vk5xz" event={"ID":"7739fd2d-10b5-425d-acbf-f50630f07017","Type":"ContainerStarted","Data":"2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e"} Oct 11 10:55:06.078052 master-0 kubenswrapper[4790]: I1011 10:55:06.077965 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7887b79bcd-vk5xz" event={"ID":"7739fd2d-10b5-425d-acbf-f50630f07017","Type":"ContainerStarted","Data":"477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe"} Oct 11 10:55:06.078052 master-0 kubenswrapper[4790]: I1011 10:55:06.077980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7887b79bcd-vk5xz" 
event={"ID":"7739fd2d-10b5-425d-acbf-f50630f07017","Type":"ContainerStarted","Data":"bff1b3163e12e8cb39a3fbc83d1e97e1c30281603df9047eeb4b5e8ea878efe5"} Oct 11 10:55:06.078422 master-0 kubenswrapper[4790]: I1011 10:55:06.078360 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:06.079494 master-0 kubenswrapper[4790]: I1011 10:55:06.079453 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c75cc6dff-n4dg7" Oct 11 10:55:06.080194 master-0 kubenswrapper[4790]: I1011 10:55:06.080144 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"60d68c10-8e1c-4a92-86f6-e2925df0f714","Type":"ContainerStarted","Data":"3242bfe4a3531200a62ec255f320d35782117671872a6959f988bac26f9f7d3f"} Oct 11 10:55:06.108328 master-0 kubenswrapper[4790]: I1011 10:55:06.108110 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7887b79bcd-vk5xz" podStartSLOduration=2.108072935 podStartE2EDuration="2.108072935s" podCreationTimestamp="2025-10-11 10:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:55:06.104085726 +0000 UTC m=+982.658546028" watchObservedRunningTime="2025-10-11 10:55:06.108072935 +0000 UTC m=+982.662533247" Oct 11 10:55:06.183561 master-0 kubenswrapper[4790]: I1011 10:55:06.183514 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c75cc6dff-n4dg7"] Oct 11 10:55:06.191152 master-0 kubenswrapper[4790]: I1011 10:55:06.191087 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c75cc6dff-n4dg7"] Oct 11 10:55:06.315454 master-0 kubenswrapper[4790]: I1011 10:55:06.315369 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267b88dd-f511-44be-83eb-15e57143e363" 
path="/var/lib/kubelet/pods/267b88dd-f511-44be-83eb-15e57143e363/volumes" Oct 11 10:55:06.412671 master-0 kubenswrapper[4790]: I1011 10:55:06.412508 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:55:06.412671 master-0 kubenswrapper[4790]: I1011 10:55:06.412660 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 10:55:06.501745 master-0 kubenswrapper[4790]: I1011 10:55:06.501611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:55:08.890489 master-0 kubenswrapper[4790]: I1011 10:55:08.890425 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-848fcbb4df-dr4lc" Oct 11 10:55:15.213150 master-0 kubenswrapper[4790]: I1011 10:55:15.212973 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bb968bd67-mk4sp"] Oct 11 10:55:15.219284 master-0 kubenswrapper[4790]: I1011 10:55:15.219247 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.223628 master-0 kubenswrapper[4790]: I1011 10:55:15.223570 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 10:55:15.223775 master-0 kubenswrapper[4790]: I1011 10:55:15.223601 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 10:55:15.223775 master-0 kubenswrapper[4790]: I1011 10:55:15.223605 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 10:55:15.223865 master-0 kubenswrapper[4790]: I1011 10:55:15.223811 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 11 10:55:15.228213 master-0 kubenswrapper[4790]: I1011 10:55:15.227652 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 11 10:55:15.256560 master-0 kubenswrapper[4790]: I1011 10:55:15.248799 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb968bd67-mk4sp"] Oct 11 10:55:15.308096 master-0 kubenswrapper[4790]: I1011 10:55:15.308011 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-sb\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.309042 master-0 kubenswrapper[4790]: I1011 10:55:15.308495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-nb\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.309042 master-0 kubenswrapper[4790]: I1011 
10:55:15.308591 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tbm5\" (UniqueName: \"kubernetes.io/projected/82f87bf9-c348-4149-a72c-99e49db4ec09-kube-api-access-2tbm5\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.309042 master-0 kubenswrapper[4790]: I1011 10:55:15.308625 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-config\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.309042 master-0 kubenswrapper[4790]: I1011 10:55:15.308672 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-svc\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.309042 master-0 kubenswrapper[4790]: I1011 10:55:15.308762 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-swift-storage-0\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.411352 master-0 kubenswrapper[4790]: I1011 10:55:15.411284 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-nb\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " 
pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.411352 master-0 kubenswrapper[4790]: I1011 10:55:15.411354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tbm5\" (UniqueName: \"kubernetes.io/projected/82f87bf9-c348-4149-a72c-99e49db4ec09-kube-api-access-2tbm5\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.411649 master-0 kubenswrapper[4790]: I1011 10:55:15.411386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-config\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.411649 master-0 kubenswrapper[4790]: I1011 10:55:15.411414 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-svc\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.411649 master-0 kubenswrapper[4790]: I1011 10:55:15.411461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-swift-storage-0\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.411649 master-0 kubenswrapper[4790]: I1011 10:55:15.411561 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-sb\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " 
pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.412787 master-0 kubenswrapper[4790]: I1011 10:55:15.412740 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-sb\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.413466 master-0 kubenswrapper[4790]: I1011 10:55:15.413431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-nb\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.414523 master-0 kubenswrapper[4790]: I1011 10:55:15.414481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-config\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.414671 master-0 kubenswrapper[4790]: I1011 10:55:15.414641 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-svc\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.417308 master-0 kubenswrapper[4790]: I1011 10:55:15.415298 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-swift-storage-0\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 
10:55:15.516049 master-0 kubenswrapper[4790]: I1011 10:55:15.515917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tbm5\" (UniqueName: \"kubernetes.io/projected/82f87bf9-c348-4149-a72c-99e49db4ec09-kube-api-access-2tbm5\") pod \"dnsmasq-dns-bb968bd67-mk4sp\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:15.615874 master-0 kubenswrapper[4790]: I1011 10:55:15.615792 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:16.061640 master-0 kubenswrapper[4790]: I1011 10:55:16.061572 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb968bd67-mk4sp"] Oct 11 10:55:16.206286 master-0 kubenswrapper[4790]: I1011 10:55:16.206214 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f5c98dcd5-c8xhk"] Oct 11 10:55:16.208340 master-0 kubenswrapper[4790]: I1011 10:55:16.207997 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.216097 master-0 kubenswrapper[4790]: I1011 10:55:16.211886 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 11 10:55:16.216097 master-0 kubenswrapper[4790]: I1011 10:55:16.214222 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 11 10:55:16.220277 master-0 kubenswrapper[4790]: I1011 10:55:16.220194 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f5c98dcd5-c8xhk"] Oct 11 10:55:16.238608 master-0 kubenswrapper[4790]: I1011 10:55:16.238490 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-combined-ca-bundle\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.238902 master-0 kubenswrapper[4790]: I1011 10:55:16.238650 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data-custom\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.240218 master-0 kubenswrapper[4790]: I1011 10:55:16.239997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.240218 master-0 kubenswrapper[4790]: I1011 10:55:16.240108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-bb8r8\" (UniqueName: \"kubernetes.io/projected/c41b60ff-457d-4c45-8b56-4523c5c0097f-kube-api-access-bb8r8\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.341883 master-0 kubenswrapper[4790]: I1011 10:55:16.341789 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8r8\" (UniqueName: \"kubernetes.io/projected/c41b60ff-457d-4c45-8b56-4523c5c0097f-kube-api-access-bb8r8\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.342112 master-0 kubenswrapper[4790]: I1011 10:55:16.341898 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-combined-ca-bundle\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.342112 master-0 kubenswrapper[4790]: I1011 10:55:16.341964 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data-custom\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.342112 master-0 kubenswrapper[4790]: I1011 10:55:16.342075 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.347299 master-0 kubenswrapper[4790]: I1011 10:55:16.347210 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-combined-ca-bundle\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.348586 master-0 kubenswrapper[4790]: I1011 10:55:16.348528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.348670 master-0 kubenswrapper[4790]: I1011 10:55:16.348593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data-custom\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.366856 master-0 kubenswrapper[4790]: I1011 10:55:16.366732 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8r8\" (UniqueName: \"kubernetes.io/projected/c41b60ff-457d-4c45-8b56-4523c5c0097f-kube-api-access-bb8r8\") pod \"heat-api-5f5c98dcd5-c8xhk\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") " pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:16.547684 master-0 kubenswrapper[4790]: I1011 10:55:16.544988 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:25.058863 master-0 kubenswrapper[4790]: W1011 10:55:25.058781 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82f87bf9_c348_4149_a72c_99e49db4ec09.slice/crio-792d4dacd71ae32e22ff517532decce940e406c6a15096335b14e8b534a2992c WatchSource:0}: Error finding container 792d4dacd71ae32e22ff517532decce940e406c6a15096335b14e8b534a2992c: Status 404 returned error can't find the container with id 792d4dacd71ae32e22ff517532decce940e406c6a15096335b14e8b534a2992c Oct 11 10:55:25.061592 master-0 kubenswrapper[4790]: I1011 10:55:25.061529 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb968bd67-mk4sp"] Oct 11 10:55:25.206805 master-0 kubenswrapper[4790]: I1011 10:55:25.205752 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f5c98dcd5-c8xhk"] Oct 11 10:55:25.295480 master-0 kubenswrapper[4790]: I1011 10:55:25.295339 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" event={"ID":"82f87bf9-c348-4149-a72c-99e49db4ec09","Type":"ContainerStarted","Data":"792d4dacd71ae32e22ff517532decce940e406c6a15096335b14e8b534a2992c"} Oct 11 10:55:25.299741 master-0 kubenswrapper[4790]: I1011 10:55:25.298468 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"60d68c10-8e1c-4a92-86f6-e2925df0f714","Type":"ContainerStarted","Data":"e192e2becd115690c1d668bc7d78fc93266390a2fb95e6340eaf7c60f1f198e9"} Oct 11 10:55:25.299741 master-0 kubenswrapper[4790]: I1011 10:55:25.298513 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"60d68c10-8e1c-4a92-86f6-e2925df0f714","Type":"ContainerStarted","Data":"a9ec38cb1d58ff96f557dd1efeffa25f09991c28ad10ce26a6daa40b2d7694a0"} Oct 11 10:55:25.300360 master-0 
kubenswrapper[4790]: I1011 10:55:25.300239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f5c98dcd5-c8xhk" event={"ID":"c41b60ff-457d-4c45-8b56-4523c5c0097f","Type":"ContainerStarted","Data":"c3de8b5a223cd729b6c034e2228715eb85d26de982d08c60dc47229ac7d1b110"} Oct 11 10:55:25.373928 master-0 kubenswrapper[4790]: I1011 10:55:25.373243 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" podStartSLOduration=1.8724007299999998 podStartE2EDuration="21.373215811s" podCreationTimestamp="2025-10-11 10:55:04 +0000 UTC" firstStartedPulling="2025-10-11 10:55:05.185125106 +0000 UTC m=+981.739585398" lastFinishedPulling="2025-10-11 10:55:24.685940187 +0000 UTC m=+1001.240400479" observedRunningTime="2025-10-11 10:55:25.372858921 +0000 UTC m=+1001.927319223" watchObservedRunningTime="2025-10-11 10:55:25.373215811 +0000 UTC m=+1001.927676103" Oct 11 10:55:25.431523 master-0 kubenswrapper[4790]: I1011 10:55:25.431409 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-89f9b4488-8vvt9"] Oct 11 10:55:25.433201 master-0 kubenswrapper[4790]: I1011 10:55:25.433169 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.441772 master-0 kubenswrapper[4790]: I1011 10:55:25.437198 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 11 10:55:25.456803 master-0 kubenswrapper[4790]: I1011 10:55:25.451833 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-89f9b4488-8vvt9"] Oct 11 10:55:25.564979 master-0 kubenswrapper[4790]: I1011 10:55:25.564902 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.564979 master-0 kubenswrapper[4790]: I1011 10:55:25.564968 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data-custom\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.565476 master-0 kubenswrapper[4790]: I1011 10:55:25.565065 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-combined-ca-bundle\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.565476 master-0 kubenswrapper[4790]: I1011 10:55:25.565102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxk4p\" (UniqueName: \"kubernetes.io/projected/90ca8fc6-bc53-461b-8384-ca8344e8abb1-kube-api-access-lxk4p\") pod 
\"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.667329 master-0 kubenswrapper[4790]: I1011 10:55:25.667237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxk4p\" (UniqueName: \"kubernetes.io/projected/90ca8fc6-bc53-461b-8384-ca8344e8abb1-kube-api-access-lxk4p\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.667562 master-0 kubenswrapper[4790]: I1011 10:55:25.667419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.667562 master-0 kubenswrapper[4790]: I1011 10:55:25.667449 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data-custom\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.667562 master-0 kubenswrapper[4790]: I1011 10:55:25.667475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-combined-ca-bundle\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.671906 master-0 kubenswrapper[4790]: I1011 10:55:25.671853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-combined-ca-bundle\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.673302 master-0 kubenswrapper[4790]: I1011 10:55:25.673270 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.674166 master-0 kubenswrapper[4790]: I1011 10:55:25.674108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data-custom\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.690977 master-0 kubenswrapper[4790]: I1011 10:55:25.689470 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxk4p\" (UniqueName: \"kubernetes.io/projected/90ca8fc6-bc53-461b-8384-ca8344e8abb1-kube-api-access-lxk4p\") pod \"heat-cfnapi-89f9b4488-8vvt9\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") " pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:25.773056 master-0 kubenswrapper[4790]: I1011 10:55:25.771669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:26.096507 master-0 kubenswrapper[4790]: I1011 10:55:26.096423 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-r9jnj"] Oct 11 10:55:26.098606 master-0 kubenswrapper[4790]: I1011 10:55:26.097903 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-r9jnj" Oct 11 10:55:26.117598 master-0 kubenswrapper[4790]: I1011 10:55:26.116553 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r9jnj"] Oct 11 10:55:26.195822 master-0 kubenswrapper[4790]: I1011 10:55:26.195673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcmpk\" (UniqueName: \"kubernetes.io/projected/5b7ae2a3-6802-400c-bbe7-5729052a2c1c-kube-api-access-kcmpk\") pod \"nova-api-db-create-r9jnj\" (UID: \"5b7ae2a3-6802-400c-bbe7-5729052a2c1c\") " pod="openstack/nova-api-db-create-r9jnj" Oct 11 10:55:26.206294 master-0 kubenswrapper[4790]: I1011 10:55:26.206230 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rgsq2"] Oct 11 10:55:26.210436 master-0 kubenswrapper[4790]: I1011 10:55:26.209812 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rgsq2" Oct 11 10:55:26.226142 master-0 kubenswrapper[4790]: I1011 10:55:26.226039 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rgsq2"] Oct 11 10:55:26.281698 master-0 kubenswrapper[4790]: I1011 10:55:26.281631 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-89f9b4488-8vvt9"] Oct 11 10:55:26.297191 master-0 kubenswrapper[4790]: I1011 10:55:26.297116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcmpk\" (UniqueName: \"kubernetes.io/projected/5b7ae2a3-6802-400c-bbe7-5729052a2c1c-kube-api-access-kcmpk\") pod \"nova-api-db-create-r9jnj\" (UID: \"5b7ae2a3-6802-400c-bbe7-5729052a2c1c\") " pod="openstack/nova-api-db-create-r9jnj" Oct 11 10:55:26.297288 master-0 kubenswrapper[4790]: I1011 10:55:26.297241 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7jm\" (UniqueName: 
\"kubernetes.io/projected/7bb6dd47-2665-4a3f-8773-2a61034146a3-kube-api-access-5m7jm\") pod \"nova-cell0-db-create-rgsq2\" (UID: \"7bb6dd47-2665-4a3f-8773-2a61034146a3\") " pod="openstack/nova-cell0-db-create-rgsq2" Oct 11 10:55:26.317280 master-0 kubenswrapper[4790]: I1011 10:55:26.317217 4790 generic.go:334] "Generic (PLEG): container finished" podID="82f87bf9-c348-4149-a72c-99e49db4ec09" containerID="2b846ca497c34f678163324ba082b2f539146a51cf7170ec086171931a25f9ba" exitCode=0 Oct 11 10:55:26.317534 master-0 kubenswrapper[4790]: I1011 10:55:26.317473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" event={"ID":"82f87bf9-c348-4149-a72c-99e49db4ec09","Type":"ContainerDied","Data":"2b846ca497c34f678163324ba082b2f539146a51cf7170ec086171931a25f9ba"} Oct 11 10:55:26.327306 master-0 kubenswrapper[4790]: I1011 10:55:26.319484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcmpk\" (UniqueName: \"kubernetes.io/projected/5b7ae2a3-6802-400c-bbe7-5729052a2c1c-kube-api-access-kcmpk\") pod \"nova-api-db-create-r9jnj\" (UID: \"5b7ae2a3-6802-400c-bbe7-5729052a2c1c\") " pod="openstack/nova-api-db-create-r9jnj" Oct 11 10:55:26.327306 master-0 kubenswrapper[4790]: I1011 10:55:26.321815 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"30c351cb-246a-4343-a56d-c74fb4be119e","Type":"ContainerStarted","Data":"f0083e208a574369bee7ac5b619444ddb040559a7e6adc8699675b4802ed4ee1"} Oct 11 10:55:26.327306 master-0 kubenswrapper[4790]: I1011 10:55:26.321876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"30c351cb-246a-4343-a56d-c74fb4be119e","Type":"ContainerStarted","Data":"60be7f28501f150423ff01b582dcc8cfc5bab82c1c195c3a36e327493870a191"} Oct 11 10:55:26.413502 master-0 kubenswrapper[4790]: I1011 10:55:26.413086 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5m7jm\" (UniqueName: \"kubernetes.io/projected/7bb6dd47-2665-4a3f-8773-2a61034146a3-kube-api-access-5m7jm\") pod \"nova-cell0-db-create-rgsq2\" (UID: \"7bb6dd47-2665-4a3f-8773-2a61034146a3\") " pod="openstack/nova-cell0-db-create-rgsq2" Oct 11 10:55:26.418083 master-0 kubenswrapper[4790]: I1011 10:55:26.417989 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b5802-api-1" podStartSLOduration=3.3422582370000002 podStartE2EDuration="22.417942767s" podCreationTimestamp="2025-10-11 10:55:04 +0000 UTC" firstStartedPulling="2025-10-11 10:55:05.582805523 +0000 UTC m=+982.137265815" lastFinishedPulling="2025-10-11 10:55:24.658490053 +0000 UTC m=+1001.212950345" observedRunningTime="2025-10-11 10:55:26.396274361 +0000 UTC m=+1002.950734673" watchObservedRunningTime="2025-10-11 10:55:26.417942767 +0000 UTC m=+1002.972422549" Oct 11 10:55:26.428914 master-0 kubenswrapper[4790]: I1011 10:55:26.428063 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r9jnj" Oct 11 10:55:26.459837 master-0 kubenswrapper[4790]: I1011 10:55:26.454410 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nw6gg"] Oct 11 10:55:26.472999 master-0 kubenswrapper[4790]: I1011 10:55:26.472932 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nw6gg"] Oct 11 10:55:26.473122 master-0 kubenswrapper[4790]: I1011 10:55:26.473091 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nw6gg" Oct 11 10:55:26.489120 master-0 kubenswrapper[4790]: I1011 10:55:26.489077 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7jm\" (UniqueName: \"kubernetes.io/projected/7bb6dd47-2665-4a3f-8773-2a61034146a3-kube-api-access-5m7jm\") pod \"nova-cell0-db-create-rgsq2\" (UID: \"7bb6dd47-2665-4a3f-8773-2a61034146a3\") " pod="openstack/nova-cell0-db-create-rgsq2" Oct 11 10:55:26.572842 master-0 kubenswrapper[4790]: I1011 10:55:26.572218 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rgsq2" Oct 11 10:55:26.632369 master-0 kubenswrapper[4790]: I1011 10:55:26.627136 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w4cd\" (UniqueName: \"kubernetes.io/projected/8b0929b8-354d-4de6-9e2d-ac6e11324b10-kube-api-access-7w4cd\") pod \"nova-cell1-db-create-nw6gg\" (UID: \"8b0929b8-354d-4de6-9e2d-ac6e11324b10\") " pod="openstack/nova-cell1-db-create-nw6gg" Oct 11 10:55:26.729803 master-0 kubenswrapper[4790]: I1011 10:55:26.729612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w4cd\" (UniqueName: \"kubernetes.io/projected/8b0929b8-354d-4de6-9e2d-ac6e11324b10-kube-api-access-7w4cd\") pod \"nova-cell1-db-create-nw6gg\" (UID: \"8b0929b8-354d-4de6-9e2d-ac6e11324b10\") " pod="openstack/nova-cell1-db-create-nw6gg" Oct 11 10:55:26.763223 master-0 kubenswrapper[4790]: I1011 10:55:26.763177 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w4cd\" (UniqueName: \"kubernetes.io/projected/8b0929b8-354d-4de6-9e2d-ac6e11324b10-kube-api-access-7w4cd\") pod \"nova-cell1-db-create-nw6gg\" (UID: \"8b0929b8-354d-4de6-9e2d-ac6e11324b10\") " pod="openstack/nova-cell1-db-create-nw6gg" Oct 11 10:55:26.812684 master-0 kubenswrapper[4790]: I1011 10:55:26.810542 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nw6gg" Oct 11 10:55:26.909295 master-0 kubenswrapper[4790]: I1011 10:55:26.908904 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:27.044394 master-0 kubenswrapper[4790]: I1011 10:55:27.044352 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-svc\") pod \"82f87bf9-c348-4149-a72c-99e49db4ec09\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " Oct 11 10:55:27.044394 master-0 kubenswrapper[4790]: I1011 10:55:27.044388 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-nb\") pod \"82f87bf9-c348-4149-a72c-99e49db4ec09\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " Oct 11 10:55:27.044517 master-0 kubenswrapper[4790]: I1011 10:55:27.044427 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-sb\") pod \"82f87bf9-c348-4149-a72c-99e49db4ec09\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " Oct 11 10:55:27.044517 master-0 kubenswrapper[4790]: I1011 10:55:27.044486 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tbm5\" (UniqueName: \"kubernetes.io/projected/82f87bf9-c348-4149-a72c-99e49db4ec09-kube-api-access-2tbm5\") pod \"82f87bf9-c348-4149-a72c-99e49db4ec09\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " Oct 11 10:55:27.044589 master-0 kubenswrapper[4790]: I1011 10:55:27.044546 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-config\") pod \"82f87bf9-c348-4149-a72c-99e49db4ec09\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " Oct 11 10:55:27.044620 master-0 kubenswrapper[4790]: I1011 10:55:27.044597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-swift-storage-0\") pod \"82f87bf9-c348-4149-a72c-99e49db4ec09\" (UID: \"82f87bf9-c348-4149-a72c-99e49db4ec09\") " Oct 11 10:55:27.048610 master-0 kubenswrapper[4790]: I1011 10:55:27.048538 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f87bf9-c348-4149-a72c-99e49db4ec09-kube-api-access-2tbm5" (OuterVolumeSpecName: "kube-api-access-2tbm5") pod "82f87bf9-c348-4149-a72c-99e49db4ec09" (UID: "82f87bf9-c348-4149-a72c-99e49db4ec09"). InnerVolumeSpecName "kube-api-access-2tbm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:27.060660 master-0 kubenswrapper[4790]: I1011 10:55:27.060609 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r9jnj"] Oct 11 10:55:27.077766 master-0 kubenswrapper[4790]: I1011 10:55:27.077661 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82f87bf9-c348-4149-a72c-99e49db4ec09" (UID: "82f87bf9-c348-4149-a72c-99e49db4ec09"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:27.095834 master-0 kubenswrapper[4790]: I1011 10:55:27.092132 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82f87bf9-c348-4149-a72c-99e49db4ec09" (UID: "82f87bf9-c348-4149-a72c-99e49db4ec09"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:27.097737 master-0 kubenswrapper[4790]: I1011 10:55:27.096788 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82f87bf9-c348-4149-a72c-99e49db4ec09" (UID: "82f87bf9-c348-4149-a72c-99e49db4ec09"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:27.097737 master-0 kubenswrapper[4790]: I1011 10:55:27.097311 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f5c98dcd5-c8xhk"] Oct 11 10:55:27.117395 master-0 kubenswrapper[4790]: I1011 10:55:27.117337 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "82f87bf9-c348-4149-a72c-99e49db4ec09" (UID: "82f87bf9-c348-4149-a72c-99e49db4ec09"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:27.135237 master-0 kubenswrapper[4790]: I1011 10:55:27.135039 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-config" (OuterVolumeSpecName: "config") pod "82f87bf9-c348-4149-a72c-99e49db4ec09" (UID: "82f87bf9-c348-4149-a72c-99e49db4ec09"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:55:27.148886 master-0 kubenswrapper[4790]: I1011 10:55:27.148178 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:27.148886 master-0 kubenswrapper[4790]: I1011 10:55:27.148419 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:27.148886 master-0 kubenswrapper[4790]: I1011 10:55:27.148436 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-dns-svc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:27.148886 master-0 kubenswrapper[4790]: I1011 10:55:27.148452 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:27.148886 master-0 kubenswrapper[4790]: I1011 10:55:27.148466 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f87bf9-c348-4149-a72c-99e49db4ec09-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:27.148886 master-0 kubenswrapper[4790]: I1011 10:55:27.148476 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tbm5\" (UniqueName: \"kubernetes.io/projected/82f87bf9-c348-4149-a72c-99e49db4ec09-kube-api-access-2tbm5\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:27.234165 master-0 kubenswrapper[4790]: W1011 10:55:27.221424 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bb6dd47_2665_4a3f_8773_2a61034146a3.slice/crio-159cf359ecee116df9e95cf785829a9f2d2d271b6a361ea0f6a5198dc8a786f3 WatchSource:0}: Error finding container 159cf359ecee116df9e95cf785829a9f2d2d271b6a361ea0f6a5198dc8a786f3: Status 404 returned error can't find the container with id 159cf359ecee116df9e95cf785829a9f2d2d271b6a361ea0f6a5198dc8a786f3 Oct 11 10:55:27.234165 master-0 kubenswrapper[4790]: I1011 10:55:27.223770 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rgsq2"] Oct 11 10:55:27.341087 master-0 kubenswrapper[4790]: I1011 10:55:27.339038 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rgsq2" event={"ID":"7bb6dd47-2665-4a3f-8773-2a61034146a3","Type":"ContainerStarted","Data":"159cf359ecee116df9e95cf785829a9f2d2d271b6a361ea0f6a5198dc8a786f3"} Oct 11 10:55:27.341087 master-0 kubenswrapper[4790]: I1011 10:55:27.341005 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r9jnj" event={"ID":"5b7ae2a3-6802-400c-bbe7-5729052a2c1c","Type":"ContainerStarted","Data":"187ba140386fe1bdb57e2319039e5e08987bfe52015b52422eac9cff8a82d276"} Oct 11 10:55:27.341087 master-0 kubenswrapper[4790]: I1011 10:55:27.341031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r9jnj" event={"ID":"5b7ae2a3-6802-400c-bbe7-5729052a2c1c","Type":"ContainerStarted","Data":"4074522a220cfcc13675ec89ed6e6addf00a02c34d3e5d2f86e64b3f545d3cad"} Oct 11 10:55:27.343578 master-0 kubenswrapper[4790]: I1011 10:55:27.343534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" event={"ID":"90ca8fc6-bc53-461b-8384-ca8344e8abb1","Type":"ContainerStarted","Data":"2889f39d4725531c10441fa9236d4ba817fb73083c92ada0288c6f7dfdb54987"} Oct 11 10:55:27.351006 master-0 kubenswrapper[4790]: I1011 10:55:27.350965 4790 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" Oct 11 10:55:27.352137 master-0 kubenswrapper[4790]: I1011 10:55:27.352072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb968bd67-mk4sp" event={"ID":"82f87bf9-c348-4149-a72c-99e49db4ec09","Type":"ContainerDied","Data":"792d4dacd71ae32e22ff517532decce940e406c6a15096335b14e8b534a2992c"} Oct 11 10:55:27.352137 master-0 kubenswrapper[4790]: I1011 10:55:27.352115 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-b5802-api-1" Oct 11 10:55:27.352228 master-0 kubenswrapper[4790]: I1011 10:55:27.352150 4790 scope.go:117] "RemoveContainer" containerID="2b846ca497c34f678163324ba082b2f539146a51cf7170ec086171931a25f9ba" Oct 11 10:55:27.370888 master-0 kubenswrapper[4790]: I1011 10:55:27.370798 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nw6gg"] Oct 11 10:55:27.378232 master-0 kubenswrapper[4790]: I1011 10:55:27.377186 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-r9jnj" podStartSLOduration=1.377164633 podStartE2EDuration="1.377164633s" podCreationTimestamp="2025-10-11 10:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:55:27.366290145 +0000 UTC m=+1003.920750457" watchObservedRunningTime="2025-10-11 10:55:27.377164633 +0000 UTC m=+1003.931624925" Oct 11 10:55:27.388724 master-0 kubenswrapper[4790]: W1011 10:55:27.388488 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b0929b8_354d_4de6_9e2d_ac6e11324b10.slice/crio-99f365cb76e3f76cb36494306ddb4f6a8b5e7332210b71ce165a6676119b559f WatchSource:0}: Error finding container 99f365cb76e3f76cb36494306ddb4f6a8b5e7332210b71ce165a6676119b559f: Status 404 returned error can't find 
the container with id 99f365cb76e3f76cb36494306ddb4f6a8b5e7332210b71ce165a6676119b559f Oct 11 10:55:27.452447 master-0 kubenswrapper[4790]: I1011 10:55:27.452294 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb968bd67-mk4sp"] Oct 11 10:55:27.457399 master-0 kubenswrapper[4790]: I1011 10:55:27.457301 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bb968bd67-mk4sp"] Oct 11 10:55:28.316757 master-0 kubenswrapper[4790]: I1011 10:55:28.316681 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f87bf9-c348-4149-a72c-99e49db4ec09" path="/var/lib/kubelet/pods/82f87bf9-c348-4149-a72c-99e49db4ec09/volumes" Oct 11 10:55:28.364448 master-0 kubenswrapper[4790]: I1011 10:55:28.364347 4790 generic.go:334] "Generic (PLEG): container finished" podID="7bb6dd47-2665-4a3f-8773-2a61034146a3" containerID="5b50296ba2efac22efde8aae60a1ee89c11a8ace1ff375049f5a9b2bda8f8fc0" exitCode=0 Oct 11 10:55:28.364448 master-0 kubenswrapper[4790]: I1011 10:55:28.364431 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rgsq2" event={"ID":"7bb6dd47-2665-4a3f-8773-2a61034146a3","Type":"ContainerDied","Data":"5b50296ba2efac22efde8aae60a1ee89c11a8ace1ff375049f5a9b2bda8f8fc0"} Oct 11 10:55:28.366386 master-0 kubenswrapper[4790]: I1011 10:55:28.366354 4790 generic.go:334] "Generic (PLEG): container finished" podID="8b0929b8-354d-4de6-9e2d-ac6e11324b10" containerID="ad9c864509e03d2c97f9d070b630e91a99b7a68797b54f4da7ce040e5a112381" exitCode=0 Oct 11 10:55:28.366455 master-0 kubenswrapper[4790]: I1011 10:55:28.366413 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nw6gg" event={"ID":"8b0929b8-354d-4de6-9e2d-ac6e11324b10","Type":"ContainerDied","Data":"ad9c864509e03d2c97f9d070b630e91a99b7a68797b54f4da7ce040e5a112381"} Oct 11 10:55:28.366455 master-0 kubenswrapper[4790]: I1011 10:55:28.366432 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-db-create-nw6gg" event={"ID":"8b0929b8-354d-4de6-9e2d-ac6e11324b10","Type":"ContainerStarted","Data":"99f365cb76e3f76cb36494306ddb4f6a8b5e7332210b71ce165a6676119b559f"} Oct 11 10:55:28.368751 master-0 kubenswrapper[4790]: I1011 10:55:28.368393 4790 generic.go:334] "Generic (PLEG): container finished" podID="5b7ae2a3-6802-400c-bbe7-5729052a2c1c" containerID="187ba140386fe1bdb57e2319039e5e08987bfe52015b52422eac9cff8a82d276" exitCode=0 Oct 11 10:55:28.368751 master-0 kubenswrapper[4790]: I1011 10:55:28.368508 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r9jnj" event={"ID":"5b7ae2a3-6802-400c-bbe7-5729052a2c1c","Type":"ContainerDied","Data":"187ba140386fe1bdb57e2319039e5e08987bfe52015b52422eac9cff8a82d276"} Oct 11 10:55:28.719822 master-0 kubenswrapper[4790]: I1011 10:55:28.718854 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-d55d46749-qq6mv"] Oct 11 10:55:28.719822 master-0 kubenswrapper[4790]: E1011 10:55:28.719242 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f87bf9-c348-4149-a72c-99e49db4ec09" containerName="init" Oct 11 10:55:28.719822 master-0 kubenswrapper[4790]: I1011 10:55:28.719276 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f87bf9-c348-4149-a72c-99e49db4ec09" containerName="init" Oct 11 10:55:28.719822 master-0 kubenswrapper[4790]: I1011 10:55:28.719438 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f87bf9-c348-4149-a72c-99e49db4ec09" containerName="init" Oct 11 10:55:28.723726 master-0 kubenswrapper[4790]: I1011 10:55:28.720823 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.728720 master-0 kubenswrapper[4790]: I1011 10:55:28.724848 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Oct 11 10:55:28.728720 master-0 kubenswrapper[4790]: I1011 10:55:28.726057 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Oct 11 10:55:28.728720 master-0 kubenswrapper[4790]: I1011 10:55:28.726160 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Oct 11 10:55:28.728720 master-0 kubenswrapper[4790]: I1011 10:55:28.726280 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Oct 11 10:55:28.728720 master-0 kubenswrapper[4790]: I1011 10:55:28.726464 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Oct 11 10:55:28.728720 master-0 kubenswrapper[4790]: I1011 10:55:28.727990 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Oct 11 10:55:28.743729 master-0 kubenswrapper[4790]: I1011 10:55:28.743454 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-d55d46749-qq6mv"] Oct 11 10:55:28.889732 master-0 kubenswrapper[4790]: I1011 10:55:28.889540 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4070c53-33f0-488e-80d1-f374f59c96cd-logs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.889732 master-0 kubenswrapper[4790]: I1011 10:55:28.889649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e4070c53-33f0-488e-80d1-f374f59c96cd-etc-podinfo\") pod 
\"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.889732 master-0 kubenswrapper[4790]: I1011 10:55:28.889688 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-public-tls-certs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.890001 master-0 kubenswrapper[4790]: I1011 10:55:28.889756 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-internal-tls-certs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.890001 master-0 kubenswrapper[4790]: I1011 10:55:28.889793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-scripts\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.893723 master-0 kubenswrapper[4790]: I1011 10:55:28.890111 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data-custom\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.893723 master-0 kubenswrapper[4790]: I1011 10:55:28.890236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.893723 master-0 kubenswrapper[4790]: I1011 10:55:28.890395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhc7\" (UniqueName: \"kubernetes.io/projected/e4070c53-33f0-488e-80d1-f374f59c96cd-kube-api-access-dwhc7\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.893723 master-0 kubenswrapper[4790]: I1011 10:55:28.890533 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-combined-ca-bundle\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.893723 master-0 kubenswrapper[4790]: I1011 10:55:28.890568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data-merged\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.994369 master-0 kubenswrapper[4790]: I1011 10:55:28.994133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.994369 master-0 kubenswrapper[4790]: I1011 10:55:28.994238 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dwhc7\" (UniqueName: \"kubernetes.io/projected/e4070c53-33f0-488e-80d1-f374f59c96cd-kube-api-access-dwhc7\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.994369 master-0 kubenswrapper[4790]: I1011 10:55:28.994282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-combined-ca-bundle\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.994369 master-0 kubenswrapper[4790]: I1011 10:55:28.994302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data-merged\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.994369 master-0 kubenswrapper[4790]: I1011 10:55:28.994327 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4070c53-33f0-488e-80d1-f374f59c96cd-logs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.996000 master-0 kubenswrapper[4790]: I1011 10:55:28.995969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e4070c53-33f0-488e-80d1-f374f59c96cd-etc-podinfo\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.996075 master-0 kubenswrapper[4790]: I1011 10:55:28.996005 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-public-tls-certs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.996075 master-0 kubenswrapper[4790]: I1011 10:55:28.996037 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-internal-tls-certs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.996075 master-0 kubenswrapper[4790]: I1011 10:55:28.996062 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-scripts\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.996343 master-0 kubenswrapper[4790]: I1011 10:55:28.996311 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data-merged\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:28.996430 master-0 kubenswrapper[4790]: I1011 10:55:28.996389 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4070c53-33f0-488e-80d1-f374f59c96cd-logs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.000481 master-0 kubenswrapper[4790]: I1011 10:55:28.997913 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data-custom\") pod 
\"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.000481 master-0 kubenswrapper[4790]: I1011 10:55:28.999438 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-combined-ca-bundle\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.003817 master-0 kubenswrapper[4790]: I1011 10:55:29.003123 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e4070c53-33f0-488e-80d1-f374f59c96cd-etc-podinfo\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.010757 master-0 kubenswrapper[4790]: I1011 10:55:29.005806 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-internal-tls-certs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.010757 master-0 kubenswrapper[4790]: I1011 10:55:29.007236 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data-custom\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.011668 master-0 kubenswrapper[4790]: I1011 10:55:29.011613 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-public-tls-certs\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " 
pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.012135 master-0 kubenswrapper[4790]: I1011 10:55:29.012109 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-scripts\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.012906 master-0 kubenswrapper[4790]: I1011 10:55:29.012868 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4070c53-33f0-488e-80d1-f374f59c96cd-config-data\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.016002 master-0 kubenswrapper[4790]: I1011 10:55:29.015942 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhc7\" (UniqueName: \"kubernetes.io/projected/e4070c53-33f0-488e-80d1-f374f59c96cd-kube-api-access-dwhc7\") pod \"ironic-d55d46749-qq6mv\" (UID: \"e4070c53-33f0-488e-80d1-f374f59c96cd\") " pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.058791 master-0 kubenswrapper[4790]: I1011 10:55:29.058691 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:29.590859 master-0 kubenswrapper[4790]: I1011 10:55:29.590804 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-d55d46749-qq6mv"] Oct 11 10:55:29.599493 master-0 kubenswrapper[4790]: I1011 10:55:29.599049 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:29.776436 master-0 kubenswrapper[4790]: I1011 10:55:29.776299 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rgsq2" Oct 11 10:55:29.852800 master-0 kubenswrapper[4790]: I1011 10:55:29.852726 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:29.923394 master-0 kubenswrapper[4790]: I1011 10:55:29.923231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m7jm\" (UniqueName: \"kubernetes.io/projected/7bb6dd47-2665-4a3f-8773-2a61034146a3-kube-api-access-5m7jm\") pod \"7bb6dd47-2665-4a3f-8773-2a61034146a3\" (UID: \"7bb6dd47-2665-4a3f-8773-2a61034146a3\") " Oct 11 10:55:29.931126 master-0 kubenswrapper[4790]: I1011 10:55:29.928512 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb6dd47-2665-4a3f-8773-2a61034146a3-kube-api-access-5m7jm" (OuterVolumeSpecName: "kube-api-access-5m7jm") pod "7bb6dd47-2665-4a3f-8773-2a61034146a3" (UID: "7bb6dd47-2665-4a3f-8773-2a61034146a3"). InnerVolumeSpecName "kube-api-access-5m7jm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:30.028817 master-0 kubenswrapper[4790]: I1011 10:55:30.028782 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m7jm\" (UniqueName: \"kubernetes.io/projected/7bb6dd47-2665-4a3f-8773-2a61034146a3-kube-api-access-5m7jm\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:30.396233 master-0 kubenswrapper[4790]: I1011 10:55:30.396128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rgsq2" event={"ID":"7bb6dd47-2665-4a3f-8773-2a61034146a3","Type":"ContainerDied","Data":"159cf359ecee116df9e95cf785829a9f2d2d271b6a361ea0f6a5198dc8a786f3"} Oct 11 10:55:30.396233 master-0 kubenswrapper[4790]: I1011 10:55:30.396198 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159cf359ecee116df9e95cf785829a9f2d2d271b6a361ea0f6a5198dc8a786f3" Oct 11 10:55:30.396233 master-0 kubenswrapper[4790]: I1011 10:55:30.396203 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rgsq2" Oct 11 10:55:30.397227 master-0 kubenswrapper[4790]: I1011 10:55:30.397181 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-d55d46749-qq6mv" event={"ID":"e4070c53-33f0-488e-80d1-f374f59c96cd","Type":"ContainerStarted","Data":"e3a3e629dbcff296cf5826be81ab4b7bccd4a010c0e1d91ad34fc053597033a5"} Oct 11 10:55:30.539030 master-0 kubenswrapper[4790]: I1011 10:55:30.534289 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"] Oct 11 10:55:31.415288 master-0 kubenswrapper[4790]: I1011 10:55:31.415212 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="cinder-volume" containerID="cri-o://a9ec38cb1d58ff96f557dd1efeffa25f09991c28ad10ce26a6daa40b2d7694a0" gracePeriod=30 Oct 11 10:55:31.415995 master-0 kubenswrapper[4790]: I1011 10:55:31.415698 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="probe" containerID="cri-o://e192e2becd115690c1d668bc7d78fc93266390a2fb95e6340eaf7c60f1f198e9" gracePeriod=30 Oct 11 10:55:31.603082 master-0 kubenswrapper[4790]: I1011 10:55:31.602984 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768f954cfc-9xg22"] Oct 11 10:55:31.603428 master-0 kubenswrapper[4790]: E1011 10:55:31.603391 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb6dd47-2665-4a3f-8773-2a61034146a3" containerName="mariadb-database-create" Oct 11 10:55:31.603428 master-0 kubenswrapper[4790]: I1011 10:55:31.603412 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb6dd47-2665-4a3f-8773-2a61034146a3" containerName="mariadb-database-create" Oct 11 10:55:31.603581 master-0 kubenswrapper[4790]: I1011 10:55:31.603557 
4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb6dd47-2665-4a3f-8773-2a61034146a3" containerName="mariadb-database-create" Oct 11 10:55:31.604541 master-0 kubenswrapper[4790]: I1011 10:55:31.604495 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.607980 master-0 kubenswrapper[4790]: I1011 10:55:31.607927 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 10:55:31.608488 master-0 kubenswrapper[4790]: I1011 10:55:31.608447 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 10:55:31.608896 master-0 kubenswrapper[4790]: I1011 10:55:31.608837 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 10:55:31.609032 master-0 kubenswrapper[4790]: I1011 10:55:31.608984 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 11 10:55:31.609156 master-0 kubenswrapper[4790]: I1011 10:55:31.609118 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 11 10:55:31.626103 master-0 kubenswrapper[4790]: I1011 10:55:31.626049 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768f954cfc-9xg22"] Oct 11 10:55:31.699100 master-0 kubenswrapper[4790]: I1011 10:55:31.698931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-svc\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.699100 master-0 kubenswrapper[4790]: I1011 10:55:31.699014 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-nb\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.699362 master-0 kubenswrapper[4790]: I1011 10:55:31.699111 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb9f4\" (UniqueName: \"kubernetes.io/projected/6ff24705-c685-47d9-ad1b-9ec04c541bf7-kube-api-access-hb9f4\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.699362 master-0 kubenswrapper[4790]: I1011 10:55:31.699143 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-swift-storage-0\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.699362 master-0 kubenswrapper[4790]: I1011 10:55:31.699183 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-sb\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.699362 master-0 kubenswrapper[4790]: I1011 10:55:31.699225 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-config\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.801677 master-0 kubenswrapper[4790]: I1011 10:55:31.801613 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-svc\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.801677 master-0 kubenswrapper[4790]: I1011 10:55:31.801680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-nb\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.801996 master-0 kubenswrapper[4790]: I1011 10:55:31.801771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb9f4\" (UniqueName: \"kubernetes.io/projected/6ff24705-c685-47d9-ad1b-9ec04c541bf7-kube-api-access-hb9f4\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.801996 master-0 kubenswrapper[4790]: I1011 10:55:31.801796 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-swift-storage-0\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.801996 master-0 kubenswrapper[4790]: I1011 10:55:31.801831 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-sb\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.801996 master-0 kubenswrapper[4790]: I1011 
10:55:31.801867 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-config\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.802853 master-0 kubenswrapper[4790]: I1011 10:55:31.802795 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-config\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.802979 master-0 kubenswrapper[4790]: I1011 10:55:31.802905 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-svc\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.803906 master-0 kubenswrapper[4790]: I1011 10:55:31.803880 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-nb\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.804819 master-0 kubenswrapper[4790]: I1011 10:55:31.804783 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-sb\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.805163 master-0 kubenswrapper[4790]: I1011 10:55:31.805110 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-swift-storage-0\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.854997 master-0 kubenswrapper[4790]: I1011 10:55:31.854802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb9f4\" (UniqueName: \"kubernetes.io/projected/6ff24705-c685-47d9-ad1b-9ec04c541bf7-kube-api-access-hb9f4\") pod \"dnsmasq-dns-768f954cfc-9xg22\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:31.945559 master-0 kubenswrapper[4790]: I1011 10:55:31.945464 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:32.424481 master-0 kubenswrapper[4790]: I1011 10:55:32.424416 4790 generic.go:334] "Generic (PLEG): container finished" podID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerID="a9ec38cb1d58ff96f557dd1efeffa25f09991c28ad10ce26a6daa40b2d7694a0" exitCode=0 Oct 11 10:55:32.424481 master-0 kubenswrapper[4790]: I1011 10:55:32.424474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"60d68c10-8e1c-4a92-86f6-e2925df0f714","Type":"ContainerDied","Data":"a9ec38cb1d58ff96f557dd1efeffa25f09991c28ad10ce26a6daa40b2d7694a0"} Oct 11 10:55:33.442353 master-0 kubenswrapper[4790]: I1011 10:55:33.442279 4790 generic.go:334] "Generic (PLEG): container finished" podID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerID="e192e2becd115690c1d668bc7d78fc93266390a2fb95e6340eaf7c60f1f198e9" exitCode=0 Oct 11 10:55:33.442353 master-0 kubenswrapper[4790]: I1011 10:55:33.442347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" 
event={"ID":"60d68c10-8e1c-4a92-86f6-e2925df0f714","Type":"ContainerDied","Data":"e192e2becd115690c1d668bc7d78fc93266390a2fb95e6340eaf7c60f1f198e9"} Oct 11 10:55:34.916350 master-0 kubenswrapper[4790]: I1011 10:55:34.916280 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:55:35.381670 master-0 kubenswrapper[4790]: I1011 10:55:35.381636 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nw6gg" Oct 11 10:55:35.391906 master-0 kubenswrapper[4790]: I1011 10:55:35.389887 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r9jnj" Oct 11 10:55:35.478689 master-0 kubenswrapper[4790]: I1011 10:55:35.478514 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nw6gg" event={"ID":"8b0929b8-354d-4de6-9e2d-ac6e11324b10","Type":"ContainerDied","Data":"99f365cb76e3f76cb36494306ddb4f6a8b5e7332210b71ce165a6676119b559f"} Oct 11 10:55:35.478689 master-0 kubenswrapper[4790]: I1011 10:55:35.478612 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f365cb76e3f76cb36494306ddb4f6a8b5e7332210b71ce165a6676119b559f" Oct 11 10:55:35.479010 master-0 kubenswrapper[4790]: I1011 10:55:35.478963 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nw6gg" Oct 11 10:55:35.480559 master-0 kubenswrapper[4790]: I1011 10:55:35.480495 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r9jnj" event={"ID":"5b7ae2a3-6802-400c-bbe7-5729052a2c1c","Type":"ContainerDied","Data":"4074522a220cfcc13675ec89ed6e6addf00a02c34d3e5d2f86e64b3f545d3cad"} Oct 11 10:55:35.480621 master-0 kubenswrapper[4790]: I1011 10:55:35.480563 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4074522a220cfcc13675ec89ed6e6addf00a02c34d3e5d2f86e64b3f545d3cad" Oct 11 10:55:35.480621 master-0 kubenswrapper[4790]: I1011 10:55:35.480579 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r9jnj" Oct 11 10:55:35.495586 master-0 kubenswrapper[4790]: I1011 10:55:35.495531 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcmpk\" (UniqueName: \"kubernetes.io/projected/5b7ae2a3-6802-400c-bbe7-5729052a2c1c-kube-api-access-kcmpk\") pod \"5b7ae2a3-6802-400c-bbe7-5729052a2c1c\" (UID: \"5b7ae2a3-6802-400c-bbe7-5729052a2c1c\") " Oct 11 10:55:35.495898 master-0 kubenswrapper[4790]: I1011 10:55:35.495735 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w4cd\" (UniqueName: \"kubernetes.io/projected/8b0929b8-354d-4de6-9e2d-ac6e11324b10-kube-api-access-7w4cd\") pod \"8b0929b8-354d-4de6-9e2d-ac6e11324b10\" (UID: \"8b0929b8-354d-4de6-9e2d-ac6e11324b10\") " Oct 11 10:55:35.511631 master-0 kubenswrapper[4790]: I1011 10:55:35.508381 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b7ae2a3-6802-400c-bbe7-5729052a2c1c-kube-api-access-kcmpk" (OuterVolumeSpecName: "kube-api-access-kcmpk") pod "5b7ae2a3-6802-400c-bbe7-5729052a2c1c" (UID: "5b7ae2a3-6802-400c-bbe7-5729052a2c1c"). InnerVolumeSpecName "kube-api-access-kcmpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:35.511631 master-0 kubenswrapper[4790]: I1011 10:55:35.509582 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0929b8-354d-4de6-9e2d-ac6e11324b10-kube-api-access-7w4cd" (OuterVolumeSpecName: "kube-api-access-7w4cd") pod "8b0929b8-354d-4de6-9e2d-ac6e11324b10" (UID: "8b0929b8-354d-4de6-9e2d-ac6e11324b10"). InnerVolumeSpecName "kube-api-access-7w4cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:35.597664 master-0 kubenswrapper[4790]: I1011 10:55:35.597579 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcmpk\" (UniqueName: \"kubernetes.io/projected/5b7ae2a3-6802-400c-bbe7-5729052a2c1c-kube-api-access-kcmpk\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:35.597664 master-0 kubenswrapper[4790]: I1011 10:55:35.597620 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w4cd\" (UniqueName: \"kubernetes.io/projected/8b0929b8-354d-4de6-9e2d-ac6e11324b10-kube-api-access-7w4cd\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:35.828761 master-0 kubenswrapper[4790]: I1011 10:55:35.828722 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:36.003988 master-0 kubenswrapper[4790]: I1011 10:55:36.003944 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004024 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-run\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004076 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-scripts\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004099 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-lib-cinder\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004130 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-cinder\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004178 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-combined-ca-bundle\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004179 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-run" (OuterVolumeSpecName: "run") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-iscsi\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004260 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004413 master-0 kubenswrapper[4790]: I1011 10:55:36.004391 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-sys\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004673 master-0 kubenswrapper[4790]: I1011 10:55:36.004454 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-brick\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004673 master-0 kubenswrapper[4790]: I1011 10:55:36.004487 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-dev\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004673 master-0 kubenswrapper[4790]: I1011 10:55:36.004520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-lib-modules\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004673 master-0 kubenswrapper[4790]: I1011 10:55:36.004510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-sys" (OuterVolumeSpecName: "sys") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004673 master-0 kubenswrapper[4790]: I1011 10:55:36.004601 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-nvme\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004673 master-0 kubenswrapper[4790]: I1011 10:55:36.004639 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-machine-id\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004569 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-dev" (OuterVolumeSpecName: "dev") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004628 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004651 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004823 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data-custom\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.004875 master-0 kubenswrapper[4790]: I1011 10:55:36.004870 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:36.005092 master-0 kubenswrapper[4790]: I1011 10:55:36.004932 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k54k8\" (UniqueName: \"kubernetes.io/projected/60d68c10-8e1c-4a92-86f6-e2925df0f714-kube-api-access-k54k8\") pod \"60d68c10-8e1c-4a92-86f6-e2925df0f714\" (UID: \"60d68c10-8e1c-4a92-86f6-e2925df0f714\") " Oct 11 10:55:36.006062 master-0 kubenswrapper[4790]: I1011 10:55:36.006021 4790 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-sys\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006062 master-0 kubenswrapper[4790]: I1011 10:55:36.006050 4790 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006153 master-0 kubenswrapper[4790]: I1011 10:55:36.006070 4790 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-dev\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006153 master-0 kubenswrapper[4790]: I1011 10:55:36.006086 4790 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-lib-modules\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006153 master-0 kubenswrapper[4790]: I1011 10:55:36.006099 4790 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-nvme\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006153 master-0 kubenswrapper[4790]: I1011 10:55:36.006113 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006153 master-0 kubenswrapper[4790]: I1011 10:55:36.006126 4790 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-run\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006153 master-0 kubenswrapper[4790]: I1011 10:55:36.006140 4790 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006153 master-0 kubenswrapper[4790]: I1011 10:55:36.006153 4790 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.006357 master-0 kubenswrapper[4790]: I1011 10:55:36.006169 4790 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/60d68c10-8e1c-4a92-86f6-e2925df0f714-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.007947 master-0 kubenswrapper[4790]: I1011 10:55:36.007893 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-scripts" (OuterVolumeSpecName: "scripts") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:36.011275 master-0 kubenswrapper[4790]: I1011 10:55:36.011237 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:36.015570 master-0 kubenswrapper[4790]: I1011 10:55:36.015500 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d68c10-8e1c-4a92-86f6-e2925df0f714-kube-api-access-k54k8" (OuterVolumeSpecName: "kube-api-access-k54k8") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "kube-api-access-k54k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:36.086657 master-0 kubenswrapper[4790]: I1011 10:55:36.086386 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:36.108119 master-0 kubenswrapper[4790]: I1011 10:55:36.108069 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-scripts\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.108119 master-0 kubenswrapper[4790]: I1011 10:55:36.108110 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.108255 master-0 kubenswrapper[4790]: I1011 10:55:36.108147 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data-custom\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.108255 master-0 kubenswrapper[4790]: I1011 10:55:36.108165 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k54k8\" (UniqueName: \"kubernetes.io/projected/60d68c10-8e1c-4a92-86f6-e2925df0f714-kube-api-access-k54k8\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.151732 master-0 kubenswrapper[4790]: I1011 10:55:36.145784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data" (OuterVolumeSpecName: "config-data") pod "60d68c10-8e1c-4a92-86f6-e2925df0f714" (UID: "60d68c10-8e1c-4a92-86f6-e2925df0f714"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:36.212335 master-0 kubenswrapper[4790]: I1011 10:55:36.212269 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d68c10-8e1c-4a92-86f6-e2925df0f714-config-data\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:36.246947 master-0 kubenswrapper[4790]: I1011 10:55:36.246245 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768f954cfc-9xg22"] Oct 11 10:55:36.256413 master-0 kubenswrapper[4790]: W1011 10:55:36.251998 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ff24705_c685_47d9_ad1b_9ec04c541bf7.slice/crio-2e505acf6ba0dd723d6c70412053c327d464e773f7d7a12371848c3428f7bbf6 WatchSource:0}: Error finding container 2e505acf6ba0dd723d6c70412053c327d464e773f7d7a12371848c3428f7bbf6: Status 404 returned error can't find the container with id 2e505acf6ba0dd723d6c70412053c327d464e773f7d7a12371848c3428f7bbf6 Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: I1011 10:55:36.347627 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8a39-account-create-clqqg"] Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: E1011 10:55:36.348645 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7ae2a3-6802-400c-bbe7-5729052a2c1c" containerName="mariadb-database-create" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: I1011 10:55:36.348676 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7ae2a3-6802-400c-bbe7-5729052a2c1c" containerName="mariadb-database-create" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: E1011 10:55:36.348724 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0929b8-354d-4de6-9e2d-ac6e11324b10" containerName="mariadb-database-create" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: I1011 10:55:36.348735 4790 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8b0929b8-354d-4de6-9e2d-ac6e11324b10" containerName="mariadb-database-create" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: E1011 10:55:36.348753 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="cinder-volume" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: I1011 10:55:36.348759 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="cinder-volume" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: E1011 10:55:36.349048 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="probe" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: I1011 10:55:36.349062 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="probe" Oct 11 10:55:36.349447 master-0 kubenswrapper[4790]: I1011 10:55:36.349448 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0929b8-354d-4de6-9e2d-ac6e11324b10" containerName="mariadb-database-create" Oct 11 10:55:36.349894 master-0 kubenswrapper[4790]: I1011 10:55:36.349481 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="cinder-volume" Oct 11 10:55:36.349894 master-0 kubenswrapper[4790]: I1011 10:55:36.349496 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7ae2a3-6802-400c-bbe7-5729052a2c1c" containerName="mariadb-database-create" Oct 11 10:55:36.349894 master-0 kubenswrapper[4790]: I1011 10:55:36.349509 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" containerName="probe" Oct 11 10:55:36.350561 master-0 kubenswrapper[4790]: I1011 10:55:36.350519 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8a39-account-create-clqqg" Oct 11 10:55:36.356499 master-0 kubenswrapper[4790]: I1011 10:55:36.356440 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8a39-account-create-clqqg"] Oct 11 10:55:36.360327 master-0 kubenswrapper[4790]: I1011 10:55:36.359522 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 11 10:55:36.504480 master-0 kubenswrapper[4790]: I1011 10:55:36.504431 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f5c98dcd5-c8xhk" event={"ID":"c41b60ff-457d-4c45-8b56-4523c5c0097f","Type":"ContainerStarted","Data":"d68ca291cf2b0da68d6a8d6c151be0aa939d7b802691ee00796f934bd8619918"} Oct 11 10:55:36.504883 master-0 kubenswrapper[4790]: I1011 10:55:36.504859 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5f5c98dcd5-c8xhk" podUID="c41b60ff-457d-4c45-8b56-4523c5c0097f" containerName="heat-api" containerID="cri-o://d68ca291cf2b0da68d6a8d6c151be0aa939d7b802691ee00796f934bd8619918" gracePeriod=60 Oct 11 10:55:36.505317 master-0 kubenswrapper[4790]: I1011 10:55:36.505302 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5f5c98dcd5-c8xhk" Oct 11 10:55:36.512307 master-0 kubenswrapper[4790]: I1011 10:55:36.511834 4790 generic.go:334] "Generic (PLEG): container finished" podID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerID="3509395ffa91a1e45916358a3e4f7592cd6af5d4e7f1b22db61be9000daad5af" exitCode=1 Oct 11 10:55:36.513069 master-0 kubenswrapper[4790]: I1011 10:55:36.513045 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" event={"ID":"90ca8fc6-bc53-461b-8384-ca8344e8abb1","Type":"ContainerDied","Data":"3509395ffa91a1e45916358a3e4f7592cd6af5d4e7f1b22db61be9000daad5af"} Oct 11 10:55:36.516128 master-0 kubenswrapper[4790]: I1011 10:55:36.513679 4790 scope.go:117] 
"RemoveContainer" containerID="3509395ffa91a1e45916358a3e4f7592cd6af5d4e7f1b22db61be9000daad5af" Oct 11 10:55:36.524347 master-0 kubenswrapper[4790]: I1011 10:55:36.523797 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" event={"ID":"6ff24705-c685-47d9-ad1b-9ec04c541bf7","Type":"ContainerStarted","Data":"2e505acf6ba0dd723d6c70412053c327d464e773f7d7a12371848c3428f7bbf6"} Oct 11 10:55:36.558665 master-0 kubenswrapper[4790]: I1011 10:55:36.557937 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"60d68c10-8e1c-4a92-86f6-e2925df0f714","Type":"ContainerDied","Data":"3242bfe4a3531200a62ec255f320d35782117671872a6959f988bac26f9f7d3f"} Oct 11 10:55:36.558665 master-0 kubenswrapper[4790]: I1011 10:55:36.558010 4790 scope.go:117] "RemoveContainer" containerID="e192e2becd115690c1d668bc7d78fc93266390a2fb95e6340eaf7c60f1f198e9" Oct 11 10:55:36.558665 master-0 kubenswrapper[4790]: I1011 10:55:36.558065 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.565645 master-0 kubenswrapper[4790]: I1011 10:55:36.565089 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptnz2\" (UniqueName: \"kubernetes.io/projected/e37e1fe6-6e89-4407-a40f-cf494a35eccd-kube-api-access-ptnz2\") pod \"nova-cell0-8a39-account-create-clqqg\" (UID: \"e37e1fe6-6e89-4407-a40f-cf494a35eccd\") " pod="openstack/nova-cell0-8a39-account-create-clqqg"
Oct 11 10:55:36.587072 master-0 kubenswrapper[4790]: I1011 10:55:36.586959 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5f5c98dcd5-c8xhk" podStartSLOduration=9.944004454 podStartE2EDuration="20.586920368s" podCreationTimestamp="2025-10-11 10:55:16 +0000 UTC" firstStartedPulling="2025-10-11 10:55:25.217346588 +0000 UTC m=+1001.771806880" lastFinishedPulling="2025-10-11 10:55:35.860262512 +0000 UTC m=+1012.414722794" observedRunningTime="2025-10-11 10:55:36.55605203 +0000 UTC m=+1013.110512322" watchObservedRunningTime="2025-10-11 10:55:36.586920368 +0000 UTC m=+1013.141380660"
Oct 11 10:55:36.603053 master-0 kubenswrapper[4790]: I1011 10:55:36.603002 4790 scope.go:117] "RemoveContainer" containerID="a9ec38cb1d58ff96f557dd1efeffa25f09991c28ad10ce26a6daa40b2d7694a0"
Oct 11 10:55:36.630942 master-0 kubenswrapper[4790]: I1011 10:55:36.630813 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"]
Oct 11 10:55:36.637016 master-0 kubenswrapper[4790]: I1011 10:55:36.636981 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"]
Oct 11 10:55:36.666956 master-0 kubenswrapper[4790]: I1011 10:55:36.666752 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"]
Oct 11 10:55:36.666956 master-0 kubenswrapper[4790]: I1011 10:55:36.666943 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptnz2\" (UniqueName: \"kubernetes.io/projected/e37e1fe6-6e89-4407-a40f-cf494a35eccd-kube-api-access-ptnz2\") pod \"nova-cell0-8a39-account-create-clqqg\" (UID: \"e37e1fe6-6e89-4407-a40f-cf494a35eccd\") " pod="openstack/nova-cell0-8a39-account-create-clqqg"
Oct 11 10:55:36.668778 master-0 kubenswrapper[4790]: I1011 10:55:36.668739 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.674998 master-0 kubenswrapper[4790]: I1011 10:55:36.674953 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b5802-volume-lvm-iscsi-config-data"
Oct 11 10:55:36.682271 master-0 kubenswrapper[4790]: I1011 10:55:36.682201 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"]
Oct 11 10:55:36.694734 master-0 kubenswrapper[4790]: I1011 10:55:36.694297 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptnz2\" (UniqueName: \"kubernetes.io/projected/e37e1fe6-6e89-4407-a40f-cf494a35eccd-kube-api-access-ptnz2\") pod \"nova-cell0-8a39-account-create-clqqg\" (UID: \"e37e1fe6-6e89-4407-a40f-cf494a35eccd\") " pod="openstack/nova-cell0-8a39-account-create-clqqg"
Oct 11 10:55:36.770958 master-0 kubenswrapper[4790]: I1011 10:55:36.770902 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-combined-ca-bundle\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.771334 master-0 kubenswrapper[4790]: I1011 10:55:36.771316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-scripts\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.771465 master-0 kubenswrapper[4790]: I1011 10:55:36.771451 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-dev\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.771697 master-0 kubenswrapper[4790]: I1011 10:55:36.771682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-sys\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.772110 master-0 kubenswrapper[4790]: I1011 10:55:36.772078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-config-data-custom\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.772173 master-0 kubenswrapper[4790]: I1011 10:55:36.772139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-locks-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.772231 master-0 kubenswrapper[4790]: I1011 10:55:36.772208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmgf8\" (UniqueName: \"kubernetes.io/projected/24482e3e-ba4c-4920-90d4-077df9a7b329-kube-api-access-vmgf8\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.772299 master-0 kubenswrapper[4790]: I1011 10:55:36.772272 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-lib-modules\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.772477 master-0 kubenswrapper[4790]: I1011 10:55:36.772462 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-iscsi\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.772592 master-0 kubenswrapper[4790]: I1011 10:55:36.772574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-config-data\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.772970 master-0 kubenswrapper[4790]: I1011 10:55:36.772908 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-lib-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.773106 master-0 kubenswrapper[4790]: I1011 10:55:36.773092 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-machine-id\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.773277 master-0 kubenswrapper[4790]: I1011 10:55:36.773263 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-nvme\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.773395 master-0 kubenswrapper[4790]: I1011 10:55:36.773381 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-run\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.773585 master-0 kubenswrapper[4790]: I1011 10:55:36.773542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-locks-brick\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.884996 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-config-data-custom\") pod
\"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-locks-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmgf8\" (UniqueName: \"kubernetes.io/projected/24482e3e-ba4c-4920-90d4-077df9a7b329-kube-api-access-vmgf8\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885121 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-lib-modules\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-iscsi\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885168 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-config-data\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-lib-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885220 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-machine-id\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-nvme\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885267 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-run\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885287 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-locks-brick\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885335 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-combined-ca-bundle\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-scripts\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-dev\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-sys\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.885755 master-0 kubenswrapper[4790]: I1011 10:55:36.885506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-sys\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.886664 master-0 kubenswrapper[4790]: I1011 10:55:36.886221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-machine-id\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.886664 master-0 kubenswrapper[4790]: I1011 10:55:36.886385 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-locks-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.886905 master-0 kubenswrapper[4790]: I1011 10:55:36.886875 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-lib-modules\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.887008 master-0 kubenswrapper[4790]: I1011 10:55:36.886930 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-iscsi\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.887084 master-0 kubenswrapper[4790]: I1011 10:55:36.886888 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-locks-brick\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.887408 master-0 kubenswrapper[4790]: I1011 10:55:36.887305 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-etc-nvme\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.887551 master-0 kubenswrapper[4790]: I1011 10:55:36.887532 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-run\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.890851 master-0 kubenswrapper[4790]: I1011 10:55:36.888529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-dev\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.890851 master-0 kubenswrapper[4790]: I1011 10:55:36.889570 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-config-data-custom\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.890851 master-0 kubenswrapper[4790]: I1011 10:55:36.889643 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName:
\"kubernetes.io/host-path/24482e3e-ba4c-4920-90d4-077df9a7b329-var-lib-cinder\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.892738 master-0 kubenswrapper[4790]: I1011 10:55:36.892127 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-combined-ca-bundle\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.893052 master-0 kubenswrapper[4790]: I1011 10:55:36.892850 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a39-account-create-clqqg"
Oct 11 10:55:36.897131 master-0 kubenswrapper[4790]: I1011 10:55:36.894349 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-config-data\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.909427 master-0 kubenswrapper[4790]: I1011 10:55:36.909338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24482e3e-ba4c-4920-90d4-077df9a7b329-scripts\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:36.918386 master-0 kubenswrapper[4790]: I1011 10:55:36.918214 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmgf8\" (UniqueName: \"kubernetes.io/projected/24482e3e-ba4c-4920-90d4-077df9a7b329-kube-api-access-vmgf8\") pod \"cinder-b5802-volume-lvm-iscsi-0\" (UID: \"24482e3e-ba4c-4920-90d4-077df9a7b329\") " pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:37.100054 master-0 kubenswrapper[4790]: I1011 10:55:37.099952 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b5802-volume-lvm-iscsi-0"
Oct 11 10:55:37.370590 master-0 kubenswrapper[4790]: I1011 10:55:37.370550 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8a39-account-create-clqqg"]
Oct 11 10:55:37.402603 master-0 kubenswrapper[4790]: I1011 10:55:37.402213 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-b5802-api-1"
Oct 11 10:55:37.594732 master-0 kubenswrapper[4790]: I1011 10:55:37.590952 4790 generic.go:334] "Generic (PLEG): container finished" podID="c41b60ff-457d-4c45-8b56-4523c5c0097f" containerID="d68ca291cf2b0da68d6a8d6c151be0aa939d7b802691ee00796f934bd8619918" exitCode=0
Oct 11 10:55:37.594732 master-0 kubenswrapper[4790]: I1011 10:55:37.591051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f5c98dcd5-c8xhk" event={"ID":"c41b60ff-457d-4c45-8b56-4523c5c0097f","Type":"ContainerDied","Data":"d68ca291cf2b0da68d6a8d6c151be0aa939d7b802691ee00796f934bd8619918"}
Oct 11 10:55:37.594732 master-0 kubenswrapper[4790]: I1011 10:55:37.591136 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f5c98dcd5-c8xhk" event={"ID":"c41b60ff-457d-4c45-8b56-4523c5c0097f","Type":"ContainerDied","Data":"c3de8b5a223cd729b6c034e2228715eb85d26de982d08c60dc47229ac7d1b110"}
Oct 11 10:55:37.594732 master-0 kubenswrapper[4790]: I1011 10:55:37.591152 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3de8b5a223cd729b6c034e2228715eb85d26de982d08c60dc47229ac7d1b110"
Oct 11 10:55:37.600743 master-0 kubenswrapper[4790]: I1011 10:55:37.597591 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a39-account-create-clqqg" event={"ID":"e37e1fe6-6e89-4407-a40f-cf494a35eccd","Type":"ContainerStarted","Data":"5bb2b37cb0135387058a9a5780b75d200e3d6755f1f8d0b825a04e68757b6f14"}
Oct 11 10:55:37.600743 master-0 kubenswrapper[4790]: I1011 10:55:37.598119 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f5c98dcd5-c8xhk"
Oct 11 10:55:37.600743 master-0 kubenswrapper[4790]: I1011 10:55:37.600128 4790 generic.go:334] "Generic (PLEG): container finished" podID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerID="471d5706c589d3886adf86b8614c32749579ddcf4ce9dfdf7d21941669c457bf" exitCode=1
Oct 11 10:55:37.600743 master-0 kubenswrapper[4790]: I1011 10:55:37.600697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" event={"ID":"90ca8fc6-bc53-461b-8384-ca8344e8abb1","Type":"ContainerDied","Data":"471d5706c589d3886adf86b8614c32749579ddcf4ce9dfdf7d21941669c457bf"}
Oct 11 10:55:37.600918 master-0 kubenswrapper[4790]: I1011 10:55:37.600773 4790 scope.go:117] "RemoveContainer" containerID="3509395ffa91a1e45916358a3e4f7592cd6af5d4e7f1b22db61be9000daad5af"
Oct 11 10:55:37.600918 master-0 kubenswrapper[4790]: I1011 10:55:37.600818 4790 scope.go:117] "RemoveContainer" containerID="471d5706c589d3886adf86b8614c32749579ddcf4ce9dfdf7d21941669c457bf"
Oct 11 10:55:37.601778 master-0 kubenswrapper[4790]: E1011 10:55:37.601059 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-89f9b4488-8vvt9_openstack(90ca8fc6-bc53-461b-8384-ca8344e8abb1)\"" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1"
Oct 11 10:55:37.604035 master-0 kubenswrapper[4790]: I1011 10:55:37.603964 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" containerID="7210d87c28a292af798e5994b8f7c1185cbe0c9dd8ab3744872cfdcf6e01c602" exitCode=0
Oct 11 10:55:37.604112 master-0 kubenswrapper[4790]: I1011 10:55:37.604046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" event={"ID":"6ff24705-c685-47d9-ad1b-9ec04c541bf7","Type":"ContainerDied","Data":"7210d87c28a292af798e5994b8f7c1185cbe0c9dd8ab3744872cfdcf6e01c602"}
Oct 11 10:55:37.660587 master-0 kubenswrapper[4790]: I1011 10:55:37.660495 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-volume-lvm-iscsi-0"]
Oct 11 10:55:37.703618 master-0 kubenswrapper[4790]: I1011 10:55:37.703559 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data-custom\") pod \"c41b60ff-457d-4c45-8b56-4523c5c0097f\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") "
Oct 11 10:55:37.703738 master-0 kubenswrapper[4790]: I1011 10:55:37.703695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-combined-ca-bundle\") pod \"c41b60ff-457d-4c45-8b56-4523c5c0097f\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") "
Oct 11 10:55:37.703834 master-0 kubenswrapper[4790]: I1011 10:55:37.703808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb8r8\" (UniqueName: \"kubernetes.io/projected/c41b60ff-457d-4c45-8b56-4523c5c0097f-kube-api-access-bb8r8\") pod \"c41b60ff-457d-4c45-8b56-4523c5c0097f\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") "
Oct 11 10:55:37.703874 master-0 kubenswrapper[4790]: I1011 10:55:37.703865 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data\") pod \"c41b60ff-457d-4c45-8b56-4523c5c0097f\" (UID: \"c41b60ff-457d-4c45-8b56-4523c5c0097f\") "
Oct 11 10:55:37.711919 master-0 kubenswrapper[4790]: I1011 10:55:37.711483 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41b60ff-457d-4c45-8b56-4523c5c0097f-kube-api-access-bb8r8" (OuterVolumeSpecName: "kube-api-access-bb8r8") pod "c41b60ff-457d-4c45-8b56-4523c5c0097f" (UID: "c41b60ff-457d-4c45-8b56-4523c5c0097f"). InnerVolumeSpecName "kube-api-access-bb8r8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:55:37.713449 master-0 kubenswrapper[4790]: I1011 10:55:37.713391 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c41b60ff-457d-4c45-8b56-4523c5c0097f" (UID: "c41b60ff-457d-4c45-8b56-4523c5c0097f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:55:37.742900 master-0 kubenswrapper[4790]: I1011 10:55:37.742700 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c41b60ff-457d-4c45-8b56-4523c5c0097f" (UID: "c41b60ff-457d-4c45-8b56-4523c5c0097f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:55:37.755398 master-0 kubenswrapper[4790]: I1011 10:55:37.755337 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data" (OuterVolumeSpecName: "config-data") pod "c41b60ff-457d-4c45-8b56-4523c5c0097f" (UID: "c41b60ff-457d-4c45-8b56-4523c5c0097f"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:55:37.806205 master-0 kubenswrapper[4790]: I1011 10:55:37.806161 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data-custom\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:37.806205 master-0 kubenswrapper[4790]: I1011 10:55:37.806199 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:37.806205 master-0 kubenswrapper[4790]: I1011 10:55:37.806210 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb8r8\" (UniqueName: \"kubernetes.io/projected/c41b60ff-457d-4c45-8b56-4523c5c0097f-kube-api-access-bb8r8\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:37.806485 master-0 kubenswrapper[4790]: I1011 10:55:37.806220 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41b60ff-457d-4c45-8b56-4523c5c0097f-config-data\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:38.314996 master-0 kubenswrapper[4790]: I1011 10:55:38.314791 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d68c10-8e1c-4a92-86f6-e2925df0f714" path="/var/lib/kubelet/pods/60d68c10-8e1c-4a92-86f6-e2925df0f714/volumes"
Oct 11 10:55:38.619287 master-0 kubenswrapper[4790]: I1011 10:55:38.619072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" event={"ID":"6ff24705-c685-47d9-ad1b-9ec04c541bf7","Type":"ContainerStarted","Data":"585d3075d62338a2a9a42f35ce1fe0086b7e13eaad601eec48eba4bae373241a"}
Oct 11 10:55:38.620759 master-0 kubenswrapper[4790]: I1011 10:55:38.620681 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768f954cfc-9xg22"
Oct 11 10:55:38.625377 master-0 kubenswrapper[4790]: I1011 10:55:38.625327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"24482e3e-ba4c-4920-90d4-077df9a7b329","Type":"ContainerStarted","Data":"4bbf3b03ee3ac7a6c2a580b58d789d9524ecea2f239e33911feb5ef644b2631e"}
Oct 11 10:55:38.625377 master-0 kubenswrapper[4790]: I1011 10:55:38.625366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"24482e3e-ba4c-4920-90d4-077df9a7b329","Type":"ContainerStarted","Data":"0413f5f4f9ae719fc0d00825e1ce60e01e59a9f8594c84b36f853ea751b147e2"}
Oct 11 10:55:38.627822 master-0 kubenswrapper[4790]: I1011 10:55:38.627781 4790 generic.go:334] "Generic (PLEG): container finished" podID="e37e1fe6-6e89-4407-a40f-cf494a35eccd" containerID="ef89d0976a05facc749dfabb5416524787541999145463ce1f713dd9a9f315fb" exitCode=0
Oct 11 10:55:38.627962 master-0 kubenswrapper[4790]: I1011 10:55:38.627830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a39-account-create-clqqg" event={"ID":"e37e1fe6-6e89-4407-a40f-cf494a35eccd","Type":"ContainerDied","Data":"ef89d0976a05facc749dfabb5416524787541999145463ce1f713dd9a9f315fb"}
Oct 11 10:55:38.629590 master-0 kubenswrapper[4790]: I1011 10:55:38.629555 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f5c98dcd5-c8xhk"
Oct 11 10:55:38.632778 master-0 kubenswrapper[4790]: I1011 10:55:38.632403 4790 scope.go:117] "RemoveContainer" containerID="471d5706c589d3886adf86b8614c32749579ddcf4ce9dfdf7d21941669c457bf"
Oct 11 10:55:38.633826 master-0 kubenswrapper[4790]: E1011 10:55:38.633001 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-89f9b4488-8vvt9_openstack(90ca8fc6-bc53-461b-8384-ca8344e8abb1)\"" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1"
Oct 11 10:55:38.658566 master-0 kubenswrapper[4790]: I1011 10:55:38.658467 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" podStartSLOduration=7.658441317 podStartE2EDuration="7.658441317s" podCreationTimestamp="2025-10-11 10:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:55:38.651774644 +0000 UTC m=+1015.206234976" watchObservedRunningTime="2025-10-11 10:55:38.658441317 +0000 UTC m=+1015.212901609"
Oct 11 10:55:38.678478 master-0 kubenswrapper[4790]: I1011 10:55:38.678399 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f5c98dcd5-c8xhk"]
Oct 11 10:55:38.692725 master-0 kubenswrapper[4790]: I1011 10:55:38.692654 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5f5c98dcd5-c8xhk"]
Oct 11 10:55:40.280528 master-0 kubenswrapper[4790]: I1011 10:55:40.280314 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5802-default-internal-api-2"]
Oct 11 10:55:40.281810 master-0 kubenswrapper[4790]: I1011 10:55:40.281677 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b5802-default-internal-api-2" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-log" containerID="cri-o://373e10b5f3797c10bf2f289215e447ba6ede87d322784f4e67d64c872023c089" gracePeriod=30
Oct 11 10:55:40.282132 master-0 kubenswrapper[4790]: I1011 10:55:40.282076 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b5802-default-internal-api-2" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-httpd" containerID="cri-o://c15b47f90fa6f6a74e4ba2cf37c1852384ef4ec1575e5ae98a981dfe2894d2d7" gracePeriod=30
Oct 11 10:55:40.317403 master-0 kubenswrapper[4790]: I1011 10:55:40.317334 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41b60ff-457d-4c45-8b56-4523c5c0097f" path="/var/lib/kubelet/pods/c41b60ff-457d-4c45-8b56-4523c5c0097f/volumes"
Oct 11 10:55:40.480188 master-0 kubenswrapper[4790]: I1011 10:55:40.478054 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a39-account-create-clqqg"
Oct 11 10:55:40.568191 master-0 kubenswrapper[4790]: I1011 10:55:40.567403 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptnz2\" (UniqueName: \"kubernetes.io/projected/e37e1fe6-6e89-4407-a40f-cf494a35eccd-kube-api-access-ptnz2\") pod \"e37e1fe6-6e89-4407-a40f-cf494a35eccd\" (UID: \"e37e1fe6-6e89-4407-a40f-cf494a35eccd\") "
Oct 11 10:55:40.592223 master-0 kubenswrapper[4790]: I1011 10:55:40.592118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37e1fe6-6e89-4407-a40f-cf494a35eccd-kube-api-access-ptnz2" (OuterVolumeSpecName: "kube-api-access-ptnz2") pod "e37e1fe6-6e89-4407-a40f-cf494a35eccd" (UID: "e37e1fe6-6e89-4407-a40f-cf494a35eccd"). InnerVolumeSpecName "kube-api-access-ptnz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:55:40.654829 master-0 kubenswrapper[4790]: I1011 10:55:40.654763 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerID="373e10b5f3797c10bf2f289215e447ba6ede87d322784f4e67d64c872023c089" exitCode=143
Oct 11 10:55:40.655384 master-0 kubenswrapper[4790]: I1011 10:55:40.654848 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"a0a5aa40-0146-4b81-83dd-761d514c557a","Type":"ContainerDied","Data":"373e10b5f3797c10bf2f289215e447ba6ede87d322784f4e67d64c872023c089"}
Oct 11 10:55:40.657190 master-0 kubenswrapper[4790]: I1011 10:55:40.657158 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a39-account-create-clqqg"
Oct 11 10:55:40.665490 master-0 kubenswrapper[4790]: I1011 10:55:40.665448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a39-account-create-clqqg" event={"ID":"e37e1fe6-6e89-4407-a40f-cf494a35eccd","Type":"ContainerDied","Data":"5bb2b37cb0135387058a9a5780b75d200e3d6755f1f8d0b825a04e68757b6f14"}
Oct 11 10:55:40.665490 master-0 kubenswrapper[4790]: I1011 10:55:40.665474 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bb2b37cb0135387058a9a5780b75d200e3d6755f1f8d0b825a04e68757b6f14"
Oct 11 10:55:40.678845 master-0 kubenswrapper[4790]: I1011 10:55:40.678186 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptnz2\" (UniqueName: \"kubernetes.io/projected/e37e1fe6-6e89-4407-a40f-cf494a35eccd-kube-api-access-ptnz2\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:40.772863 master-0 kubenswrapper[4790]: I1011 10:55:40.772127 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-89f9b4488-8vvt9"
Oct 11 10:55:40.772863 master-0 kubenswrapper[4790]: I1011 10:55:40.772192
4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" Oct 11 10:55:40.774475 master-0 kubenswrapper[4790]: I1011 10:55:40.773496 4790 scope.go:117] "RemoveContainer" containerID="471d5706c589d3886adf86b8614c32749579ddcf4ce9dfdf7d21941669c457bf" Oct 11 10:55:40.774745 master-0 kubenswrapper[4790]: E1011 10:55:40.774660 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-89f9b4488-8vvt9_openstack(90ca8fc6-bc53-461b-8384-ca8344e8abb1)\"" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" Oct 11 10:55:41.671923 master-0 kubenswrapper[4790]: I1011 10:55:41.670498 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-d55d46749-qq6mv" event={"ID":"e4070c53-33f0-488e-80d1-f374f59c96cd","Type":"ContainerStarted","Data":"9ccb660b18f89ed1e9ba902f7cad76c821973aa301bd218f890f40e09498e3e1"} Oct 11 10:55:41.673914 master-0 kubenswrapper[4790]: I1011 10:55:41.673184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" event={"ID":"24482e3e-ba4c-4920-90d4-077df9a7b329","Type":"ContainerStarted","Data":"a82b900b3e700703b6be815ee2a727d550189498d95c2ec3c75393c953c8afe0"} Oct 11 10:55:41.731303 master-0 kubenswrapper[4790]: I1011 10:55:41.731218 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" podStartSLOduration=5.7311921869999995 podStartE2EDuration="5.731192187s" podCreationTimestamp="2025-10-11 10:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:55:41.73022681 +0000 UTC m=+1018.284687102" watchObservedRunningTime="2025-10-11 10:55:41.731192187 +0000 UTC m=+1018.285652479" Oct 11 
10:55:42.101729 master-0 kubenswrapper[4790]: I1011 10:55:42.101257 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:42.104605 master-0 kubenswrapper[4790]: I1011 10:55:42.103726 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" podUID="24482e3e-ba4c-4920-90d4-077df9a7b329" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.130.0.106:8080/\": dial tcp 10.130.0.106:8080: connect: connection refused" Oct 11 10:55:42.695737 master-0 kubenswrapper[4790]: I1011 10:55:42.694549 4790 generic.go:334] "Generic (PLEG): container finished" podID="e4070c53-33f0-488e-80d1-f374f59c96cd" containerID="9ccb660b18f89ed1e9ba902f7cad76c821973aa301bd218f890f40e09498e3e1" exitCode=0 Oct 11 10:55:42.697373 master-0 kubenswrapper[4790]: I1011 10:55:42.696576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-d55d46749-qq6mv" event={"ID":"e4070c53-33f0-488e-80d1-f374f59c96cd","Type":"ContainerDied","Data":"9ccb660b18f89ed1e9ba902f7cad76c821973aa301bd218f890f40e09498e3e1"} Oct 11 10:55:42.697373 master-0 kubenswrapper[4790]: I1011 10:55:42.696617 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-d55d46749-qq6mv" event={"ID":"e4070c53-33f0-488e-80d1-f374f59c96cd","Type":"ContainerStarted","Data":"3962fee214b48290f3b4d7d88b18ff2a4ea8e104b622cbb5acbf312b3eaf73e0"} Oct 11 10:55:42.697373 master-0 kubenswrapper[4790]: I1011 10:55:42.696631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-d55d46749-qq6mv" event={"ID":"e4070c53-33f0-488e-80d1-f374f59c96cd","Type":"ContainerStarted","Data":"379173187d53388fb3d2bdca3c2f022929893aa2034fbc1ca1133cd2d76c6fc5"} Oct 11 10:55:42.697373 master-0 kubenswrapper[4790]: I1011 10:55:42.696668 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-d55d46749-qq6mv" Oct 11 
10:55:42.734743 master-0 kubenswrapper[4790]: I1011 10:55:42.733933 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-d55d46749-qq6mv" podStartSLOduration=3.280811494 podStartE2EDuration="14.733886778s" podCreationTimestamp="2025-10-11 10:55:28 +0000 UTC" firstStartedPulling="2025-10-11 10:55:29.602099608 +0000 UTC m=+1006.156559900" lastFinishedPulling="2025-10-11 10:55:41.055174892 +0000 UTC m=+1017.609635184" observedRunningTime="2025-10-11 10:55:42.730343941 +0000 UTC m=+1019.284804233" watchObservedRunningTime="2025-10-11 10:55:42.733886778 +0000 UTC m=+1019.288347080" Oct 11 10:55:43.708888 master-0 kubenswrapper[4790]: I1011 10:55:43.706865 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerID="c15b47f90fa6f6a74e4ba2cf37c1852384ef4ec1575e5ae98a981dfe2894d2d7" exitCode=0 Oct 11 10:55:43.708888 master-0 kubenswrapper[4790]: I1011 10:55:43.706968 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"a0a5aa40-0146-4b81-83dd-761d514c557a","Type":"ContainerDied","Data":"c15b47f90fa6f6a74e4ba2cf37c1852384ef4ec1575e5ae98a981dfe2894d2d7"} Oct 11 10:55:44.089566 master-0 kubenswrapper[4790]: I1011 10:55:44.089519 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:44.123409 master-0 kubenswrapper[4790]: I1011 10:55:44.122487 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-89f9b4488-8vvt9"] Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-logs\") pod \"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186326 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-httpd-run\") pod \"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186394 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vczqt\" (UniqueName: \"kubernetes.io/projected/a0a5aa40-0146-4b81-83dd-761d514c557a-kube-api-access-vczqt\") pod \"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186427 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-combined-ca-bundle\") pod \"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186472 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-config-data\") pod 
\"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186786 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-internal-tls-certs\") pod \"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188423 master-0 kubenswrapper[4790]: I1011 10:55:44.186935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-scripts\") pod \"a0a5aa40-0146-4b81-83dd-761d514c557a\" (UID: \"a0a5aa40-0146-4b81-83dd-761d514c557a\") " Oct 11 10:55:44.188929 master-0 kubenswrapper[4790]: I1011 10:55:44.188651 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-logs" (OuterVolumeSpecName: "logs") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:55:44.191075 master-0 kubenswrapper[4790]: I1011 10:55:44.189260 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:55:44.191933 master-0 kubenswrapper[4790]: I1011 10:55:44.191795 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a5aa40-0146-4b81-83dd-761d514c557a-kube-api-access-vczqt" (OuterVolumeSpecName: "kube-api-access-vczqt") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "kube-api-access-vczqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:44.192441 master-0 kubenswrapper[4790]: I1011 10:55:44.192351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-scripts" (OuterVolumeSpecName: "scripts") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:44.215888 master-0 kubenswrapper[4790]: I1011 10:55:44.215824 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:44.245678 master-0 kubenswrapper[4790]: I1011 10:55:44.245611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:44.254751 master-0 kubenswrapper[4790]: I1011 10:55:44.252858 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-config-data" (OuterVolumeSpecName: "config-data") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:44.289390 master-0 kubenswrapper[4790]: I1011 10:55:44.289274 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:44.289390 master-0 kubenswrapper[4790]: I1011 10:55:44.289318 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-scripts\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:44.289390 master-0 kubenswrapper[4790]: I1011 10:55:44.289329 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-logs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:44.289390 master-0 kubenswrapper[4790]: I1011 10:55:44.289341 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0a5aa40-0146-4b81-83dd-761d514c557a-httpd-run\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:44.289390 master-0 kubenswrapper[4790]: I1011 10:55:44.289353 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vczqt\" (UniqueName: \"kubernetes.io/projected/a0a5aa40-0146-4b81-83dd-761d514c557a-kube-api-access-vczqt\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:44.289390 master-0 kubenswrapper[4790]: I1011 10:55:44.289375 4790 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:44.289390 master-0 kubenswrapper[4790]: I1011 10:55:44.289386 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5aa40-0146-4b81-83dd-761d514c557a-config-data\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:44.309173 master-0 kubenswrapper[4790]: I1011 10:55:44.309126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39" (OuterVolumeSpecName: "glance") pod "a0a5aa40-0146-4b81-83dd-761d514c557a" (UID: "a0a5aa40-0146-4b81-83dd-761d514c557a"). InnerVolumeSpecName "pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 11 10:55:44.393387 master-0 kubenswrapper[4790]: I1011 10:55:44.393336 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") on node \"master-0\" " Oct 11 10:55:44.427417 master-0 kubenswrapper[4790]: I1011 10:55:44.427374 4790 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 11 10:55:44.427679 master-0 kubenswrapper[4790]: I1011 10:55:44.427653 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5" (UniqueName: "kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39") on node "master-0"
Oct 11 10:55:44.496644 master-0 kubenswrapper[4790]: I1011 10:55:44.496477 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:44.635113 master-0 kubenswrapper[4790]: I1011 10:55:44.635073 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-89f9b4488-8vvt9"
Oct 11 10:55:44.700379 master-0 kubenswrapper[4790]: I1011 10:55:44.700303 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-combined-ca-bundle\") pod \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") "
Oct 11 10:55:44.700645 master-0 kubenswrapper[4790]: I1011 10:55:44.700460 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxk4p\" (UniqueName: \"kubernetes.io/projected/90ca8fc6-bc53-461b-8384-ca8344e8abb1-kube-api-access-lxk4p\") pod \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") "
Oct 11 10:55:44.700645 master-0 kubenswrapper[4790]: I1011 10:55:44.700533 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data\") pod \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") "
Oct 11 10:55:44.700782 master-0 kubenswrapper[4790]: I1011 10:55:44.700698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data-custom\") pod \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\" (UID: \"90ca8fc6-bc53-461b-8384-ca8344e8abb1\") "
Oct 11 10:55:44.705327 master-0 kubenswrapper[4790]: I1011 10:55:44.705294 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90ca8fc6-bc53-461b-8384-ca8344e8abb1" (UID: "90ca8fc6-bc53-461b-8384-ca8344e8abb1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:55:44.713146 master-0 kubenswrapper[4790]: I1011 10:55:44.713067 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90ca8fc6-bc53-461b-8384-ca8344e8abb1-kube-api-access-lxk4p" (OuterVolumeSpecName: "kube-api-access-lxk4p") pod "90ca8fc6-bc53-461b-8384-ca8344e8abb1" (UID: "90ca8fc6-bc53-461b-8384-ca8344e8abb1"). InnerVolumeSpecName "kube-api-access-lxk4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:55:44.763237 master-0 kubenswrapper[4790]: I1011 10:55:44.763041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90ca8fc6-bc53-461b-8384-ca8344e8abb1" (UID: "90ca8fc6-bc53-461b-8384-ca8344e8abb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:55:44.804754 master-0 kubenswrapper[4790]: I1011 10:55:44.788376 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"a0a5aa40-0146-4b81-83dd-761d514c557a","Type":"ContainerDied","Data":"a7f963bb2db0e5602164d87267bd8833838260ebb96c568d5c14b7ec412ff1e4"}
Oct 11 10:55:44.804754 master-0 kubenswrapper[4790]: I1011 10:55:44.788450 4790 scope.go:117] "RemoveContainer" containerID="c15b47f90fa6f6a74e4ba2cf37c1852384ef4ec1575e5ae98a981dfe2894d2d7"
Oct 11 10:55:44.804754 master-0 kubenswrapper[4790]: I1011 10:55:44.788606 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:44.804754 master-0 kubenswrapper[4790]: I1011 10:55:44.802307 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data-custom\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:44.804754 master-0 kubenswrapper[4790]: I1011 10:55:44.802338 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:44.804754 master-0 kubenswrapper[4790]: I1011 10:55:44.802350 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxk4p\" (UniqueName: \"kubernetes.io/projected/90ca8fc6-bc53-461b-8384-ca8344e8abb1-kube-api-access-lxk4p\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:44.827628 master-0 kubenswrapper[4790]: I1011 10:55:44.818404 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-89f9b4488-8vvt9"
Oct 11 10:55:44.827628 master-0 kubenswrapper[4790]: I1011 10:55:44.818872 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-89f9b4488-8vvt9" event={"ID":"90ca8fc6-bc53-461b-8384-ca8344e8abb1","Type":"ContainerDied","Data":"2889f39d4725531c10441fa9236d4ba817fb73083c92ada0288c6f7dfdb54987"}
Oct 11 10:55:44.832906 master-0 kubenswrapper[4790]: I1011 10:55:44.828551 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data" (OuterVolumeSpecName: "config-data") pod "90ca8fc6-bc53-461b-8384-ca8344e8abb1" (UID: "90ca8fc6-bc53-461b-8384-ca8344e8abb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:55:44.870747 master-0 kubenswrapper[4790]: I1011 10:55:44.870076 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5802-default-internal-api-2"]
Oct 11 10:55:44.880314 master-0 kubenswrapper[4790]: I1011 10:55:44.880243 4790 scope.go:117] "RemoveContainer" containerID="373e10b5f3797c10bf2f289215e447ba6ede87d322784f4e67d64c872023c089"
Oct 11 10:55:44.889670 master-0 kubenswrapper[4790]: I1011 10:55:44.889623 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b5802-default-internal-api-2"]
Oct 11 10:55:44.905350 master-0 kubenswrapper[4790]: I1011 10:55:44.905315 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ca8fc6-bc53-461b-8384-ca8344e8abb1-config-data\") on node \"master-0\" DevicePath \"\""
Oct 11 10:55:44.926291 master-0 kubenswrapper[4790]: I1011 10:55:44.925889 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b5802-default-internal-api-2"]
Oct 11 10:55:44.926291 master-0 kubenswrapper[4790]: E1011 10:55:44.926294 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerName="heat-cfnapi"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926309 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerName="heat-cfnapi"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: E1011 10:55:44.926333 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37e1fe6-6e89-4407-a40f-cf494a35eccd" containerName="mariadb-account-create"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926340 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37e1fe6-6e89-4407-a40f-cf494a35eccd" containerName="mariadb-account-create"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: E1011 10:55:44.926347 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerName="heat-cfnapi"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926354 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerName="heat-cfnapi"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: E1011 10:55:44.926381 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41b60ff-457d-4c45-8b56-4523c5c0097f" containerName="heat-api"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926386 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41b60ff-457d-4c45-8b56-4523c5c0097f" containerName="heat-api"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: E1011 10:55:44.926393 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-log"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926399 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-log"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: E1011 10:55:44.926412 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-httpd"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926418 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-httpd"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926548 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-log"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926557 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" containerName="glance-httpd"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926566 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerName="heat-cfnapi"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926574 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41b60ff-457d-4c45-8b56-4523c5c0097f" containerName="heat-api"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926595 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" containerName="heat-cfnapi"
Oct 11 10:55:44.926726 master-0 kubenswrapper[4790]: I1011 10:55:44.926611 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37e1fe6-6e89-4407-a40f-cf494a35eccd" containerName="mariadb-account-create"
Oct 11 10:55:44.944159 master-0 kubenswrapper[4790]: I1011 10:55:44.944095 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:44.957973 master-0 kubenswrapper[4790]: I1011 10:55:44.953599 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 11 10:55:44.957973 master-0 kubenswrapper[4790]: I1011 10:55:44.954402 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b5802-default-internal-config-data"
Oct 11 10:55:44.957973 master-0 kubenswrapper[4790]: I1011 10:55:44.954592 4790 scope.go:117] "RemoveContainer" containerID="471d5706c589d3886adf86b8614c32749579ddcf4ce9dfdf7d21941669c457bf"
Oct 11 10:55:44.978735 master-0 kubenswrapper[4790]: I1011 10:55:44.972002 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-internal-api-2"]
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-combined-ca-bundle\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012412 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-scripts\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012453 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/243e93fe-e6cd-47af-95ed-3f141cb74deb-httpd-run\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012492 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-config-data\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012534 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-internal-tls-certs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012572 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243e93fe-e6cd-47af-95ed-3f141cb74deb-logs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012597 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.013522 master-0 kubenswrapper[4790]: I1011 10:55:45.012626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4lgt\" (UniqueName: \"kubernetes.io/projected/243e93fe-e6cd-47af-95ed-3f141cb74deb-kube-api-access-n4lgt\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.115048 master-0 kubenswrapper[4790]: I1011 10:55:45.114982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.115195 master-0 kubenswrapper[4790]: I1011 10:55:45.115093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4lgt\" (UniqueName: \"kubernetes.io/projected/243e93fe-e6cd-47af-95ed-3f141cb74deb-kube-api-access-n4lgt\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.115195 master-0 kubenswrapper[4790]: I1011 10:55:45.115138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-combined-ca-bundle\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.115297 master-0 kubenswrapper[4790]: I1011 10:55:45.115213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-scripts\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.116034 master-0 kubenswrapper[4790]: I1011 10:55:45.116000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/243e93fe-e6cd-47af-95ed-3f141cb74deb-httpd-run\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.116093 master-0 kubenswrapper[4790]: I1011 10:55:45.116051 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-config-data\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.116093 master-0 kubenswrapper[4790]: I1011 10:55:45.116081 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-internal-tls-certs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.116184 master-0 kubenswrapper[4790]: I1011 10:55:45.116130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243e93fe-e6cd-47af-95ed-3f141cb74deb-logs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.116687 master-0 kubenswrapper[4790]: I1011 10:55:45.116635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/243e93fe-e6cd-47af-95ed-3f141cb74deb-httpd-run\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.116752 master-0 kubenswrapper[4790]: I1011 10:55:45.116655 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/243e93fe-e6cd-47af-95ed-3f141cb74deb-logs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.117602 master-0 kubenswrapper[4790]: I1011 10:55:45.117570 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 11 10:55:45.117668 master-0 kubenswrapper[4790]: I1011 10:55:45.117601 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/b0c7c7eacbecbf6beec44181cd1a14327b215e622b505cc0fbc4653c9c57c6ce/globalmount\"" pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.118658 master-0 kubenswrapper[4790]: I1011 10:55:45.118590 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-combined-ca-bundle\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2"
Oct 11 10:55:45.119701 master-0 kubenswrapper[4790]: I1011 10:55:45.119645 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-scripts\") pod \"glance-b5802-default-internal-api-2\" (UID:
\"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:45.131949 master-0 kubenswrapper[4790]: I1011 10:55:45.131914 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-internal-tls-certs\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:45.134091 master-0 kubenswrapper[4790]: I1011 10:55:45.134058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243e93fe-e6cd-47af-95ed-3f141cb74deb-config-data\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:45.141668 master-0 kubenswrapper[4790]: I1011 10:55:45.141617 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4lgt\" (UniqueName: \"kubernetes.io/projected/243e93fe-e6cd-47af-95ed-3f141cb74deb-kube-api-access-n4lgt\") pod \"glance-b5802-default-internal-api-2\" (UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:45.166236 master-0 kubenswrapper[4790]: I1011 10:55:45.166175 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-89f9b4488-8vvt9"] Oct 11 10:55:45.173009 master-0 kubenswrapper[4790]: I1011 10:55:45.172281 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-89f9b4488-8vvt9"] Oct 11 10:55:45.982732 master-0 kubenswrapper[4790]: I1011 10:55:45.982194 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-db4348ad-f9a3-4500-a9f3-efd0f3ffb9a5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8217a7c0-7507-4763-8106-d527748a7b39\") pod \"glance-b5802-default-internal-api-2\" 
(UID: \"243e93fe-e6cd-47af-95ed-3f141cb74deb\") " pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:46.304470 master-0 kubenswrapper[4790]: I1011 10:55:46.304388 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90ca8fc6-bc53-461b-8384-ca8344e8abb1" path="/var/lib/kubelet/pods/90ca8fc6-bc53-461b-8384-ca8344e8abb1/volumes" Oct 11 10:55:46.305061 master-0 kubenswrapper[4790]: I1011 10:55:46.305032 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a5aa40-0146-4b81-83dd-761d514c557a" path="/var/lib/kubelet/pods/a0a5aa40-0146-4b81-83dd-761d514c557a/volumes" Oct 11 10:55:46.502952 master-0 kubenswrapper[4790]: I1011 10:55:46.502882 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:46.949283 master-0 kubenswrapper[4790]: I1011 10:55:46.949170 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:55:47.391735 master-0 kubenswrapper[4790]: I1011 10:55:47.391650 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b5802-volume-lvm-iscsi-0" Oct 11 10:55:47.885970 master-0 kubenswrapper[4790]: I1011 10:55:47.885903 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-internal-api-2"] Oct 11 10:55:48.882857 master-0 kubenswrapper[4790]: I1011 10:55:48.881060 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"243e93fe-e6cd-47af-95ed-3f141cb74deb","Type":"ContainerStarted","Data":"533e9496b96514b5451c5a79e329830071f463573e7952e498684538988b8ca9"} Oct 11 10:55:48.882857 master-0 kubenswrapper[4790]: I1011 10:55:48.881145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" 
event={"ID":"243e93fe-e6cd-47af-95ed-3f141cb74deb","Type":"ContainerStarted","Data":"b16754812669831b9d867df7082d7952853f0eafc7d573fa6d85c8c76b0a335d"} Oct 11 10:55:49.894653 master-0 kubenswrapper[4790]: I1011 10:55:49.894429 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-internal-api-2" event={"ID":"243e93fe-e6cd-47af-95ed-3f141cb74deb","Type":"ContainerStarted","Data":"e7459295cac3cf27355164ee90104191dcd8d154f17d3647741ac0398a118224"} Oct 11 10:55:50.021111 master-0 kubenswrapper[4790]: I1011 10:55:50.020979 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b5802-default-internal-api-2" podStartSLOduration=6.020947333 podStartE2EDuration="6.020947333s" podCreationTimestamp="2025-10-11 10:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:55:50.014093775 +0000 UTC m=+1026.568554077" watchObservedRunningTime="2025-10-11 10:55:50.020947333 +0000 UTC m=+1026.575407635" Oct 11 10:55:50.542413 master-0 kubenswrapper[4790]: I1011 10:55:50.542317 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-d55d46749-qq6mv" Oct 11 10:55:52.702062 master-0 kubenswrapper[4790]: I1011 10:55:52.701814 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:52.702600 master-0 kubenswrapper[4790]: I1011 10:55:52.702254 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b5802-api-1" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-b5802-api-log" containerID="cri-o://60be7f28501f150423ff01b582dcc8cfc5bab82c1c195c3a36e327493870a191" gracePeriod=30 Oct 11 10:55:52.702600 master-0 kubenswrapper[4790]: I1011 10:55:52.702370 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b5802-api-1" 
podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-api" containerID="cri-o://f0083e208a574369bee7ac5b619444ddb040559a7e6adc8699675b4802ed4ee1" gracePeriod=30 Oct 11 10:55:52.927915 master-0 kubenswrapper[4790]: I1011 10:55:52.927857 4790 generic.go:334] "Generic (PLEG): container finished" podID="30c351cb-246a-4343-a56d-c74fb4be119e" containerID="60be7f28501f150423ff01b582dcc8cfc5bab82c1c195c3a36e327493870a191" exitCode=143 Oct 11 10:55:52.927915 master-0 kubenswrapper[4790]: I1011 10:55:52.927925 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"30c351cb-246a-4343-a56d-c74fb4be119e","Type":"ContainerDied","Data":"60be7f28501f150423ff01b582dcc8cfc5bab82c1c195c3a36e327493870a191"} Oct 11 10:55:55.809528 master-0 kubenswrapper[4790]: I1011 10:55:55.809432 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-b5802-api-1" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-api" probeResult="failure" output="Get \"http://10.130.0.96:8776/healthcheck\": read tcp 10.130.0.2:37106->10.130.0.96:8776: read: connection reset by peer" Oct 11 10:55:55.979183 master-0 kubenswrapper[4790]: I1011 10:55:55.978045 4790 generic.go:334] "Generic (PLEG): container finished" podID="30c351cb-246a-4343-a56d-c74fb4be119e" containerID="f0083e208a574369bee7ac5b619444ddb040559a7e6adc8699675b4802ed4ee1" exitCode=0 Oct 11 10:55:55.979183 master-0 kubenswrapper[4790]: I1011 10:55:55.978109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"30c351cb-246a-4343-a56d-c74fb4be119e","Type":"ContainerDied","Data":"f0083e208a574369bee7ac5b619444ddb040559a7e6adc8699675b4802ed4ee1"} Oct 11 10:55:56.308097 master-0 kubenswrapper[4790]: I1011 10:55:56.307906 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-api-1" Oct 11 10:55:56.372515 master-0 kubenswrapper[4790]: I1011 10:55:56.372442 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30c351cb-246a-4343-a56d-c74fb4be119e-etc-machine-id\") pod \"30c351cb-246a-4343-a56d-c74fb4be119e\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " Oct 11 10:55:56.372866 master-0 kubenswrapper[4790]: I1011 10:55:56.372617 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-scripts\") pod \"30c351cb-246a-4343-a56d-c74fb4be119e\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " Oct 11 10:55:56.372866 master-0 kubenswrapper[4790]: I1011 10:55:56.372647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data\") pod \"30c351cb-246a-4343-a56d-c74fb4be119e\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " Oct 11 10:55:56.372866 master-0 kubenswrapper[4790]: I1011 10:55:56.372651 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30c351cb-246a-4343-a56d-c74fb4be119e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "30c351cb-246a-4343-a56d-c74fb4be119e" (UID: "30c351cb-246a-4343-a56d-c74fb4be119e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 10:55:56.373430 master-0 kubenswrapper[4790]: I1011 10:55:56.373351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data-custom\") pod \"30c351cb-246a-4343-a56d-c74fb4be119e\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " Oct 11 10:55:56.373660 master-0 kubenswrapper[4790]: I1011 10:55:56.373634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xghz\" (UniqueName: \"kubernetes.io/projected/30c351cb-246a-4343-a56d-c74fb4be119e-kube-api-access-5xghz\") pod \"30c351cb-246a-4343-a56d-c74fb4be119e\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " Oct 11 10:55:56.375551 master-0 kubenswrapper[4790]: I1011 10:55:56.374007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c351cb-246a-4343-a56d-c74fb4be119e-logs\") pod \"30c351cb-246a-4343-a56d-c74fb4be119e\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " Oct 11 10:55:56.375551 master-0 kubenswrapper[4790]: I1011 10:55:56.374121 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-combined-ca-bundle\") pod \"30c351cb-246a-4343-a56d-c74fb4be119e\" (UID: \"30c351cb-246a-4343-a56d-c74fb4be119e\") " Oct 11 10:55:56.376662 master-0 kubenswrapper[4790]: I1011 10:55:56.376613 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c351cb-246a-4343-a56d-c74fb4be119e-logs" (OuterVolumeSpecName: "logs") pod "30c351cb-246a-4343-a56d-c74fb4be119e" (UID: "30c351cb-246a-4343-a56d-c74fb4be119e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:55:56.376800 master-0 kubenswrapper[4790]: I1011 10:55:56.376746 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-scripts" (OuterVolumeSpecName: "scripts") pod "30c351cb-246a-4343-a56d-c74fb4be119e" (UID: "30c351cb-246a-4343-a56d-c74fb4be119e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:56.378980 master-0 kubenswrapper[4790]: I1011 10:55:56.378947 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c351cb-246a-4343-a56d-c74fb4be119e-kube-api-access-5xghz" (OuterVolumeSpecName: "kube-api-access-5xghz") pod "30c351cb-246a-4343-a56d-c74fb4be119e" (UID: "30c351cb-246a-4343-a56d-c74fb4be119e"). InnerVolumeSpecName "kube-api-access-5xghz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:55:56.382401 master-0 kubenswrapper[4790]: I1011 10:55:56.381818 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "30c351cb-246a-4343-a56d-c74fb4be119e" (UID: "30c351cb-246a-4343-a56d-c74fb4be119e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:56.393220 master-0 kubenswrapper[4790]: I1011 10:55:56.392993 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30c351cb-246a-4343-a56d-c74fb4be119e-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:56.393220 master-0 kubenswrapper[4790]: I1011 10:55:56.393031 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-scripts\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:56.393220 master-0 kubenswrapper[4790]: I1011 10:55:56.393041 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data-custom\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:56.393220 master-0 kubenswrapper[4790]: I1011 10:55:56.393053 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xghz\" (UniqueName: \"kubernetes.io/projected/30c351cb-246a-4343-a56d-c74fb4be119e-kube-api-access-5xghz\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:56.393220 master-0 kubenswrapper[4790]: I1011 10:55:56.393064 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c351cb-246a-4343-a56d-c74fb4be119e-logs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:56.412354 master-0 kubenswrapper[4790]: I1011 10:55:56.412267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c351cb-246a-4343-a56d-c74fb4be119e" (UID: "30c351cb-246a-4343-a56d-c74fb4be119e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:56.425728 master-0 kubenswrapper[4790]: I1011 10:55:56.425643 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data" (OuterVolumeSpecName: "config-data") pod "30c351cb-246a-4343-a56d-c74fb4be119e" (UID: "30c351cb-246a-4343-a56d-c74fb4be119e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:55:56.494492 master-0 kubenswrapper[4790]: I1011 10:55:56.494368 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:56.494492 master-0 kubenswrapper[4790]: I1011 10:55:56.494411 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c351cb-246a-4343-a56d-c74fb4be119e-config-data\") on node \"master-0\" DevicePath \"\"" Oct 11 10:55:56.503916 master-0 kubenswrapper[4790]: I1011 10:55:56.503864 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:56.503916 master-0 kubenswrapper[4790]: I1011 10:55:56.503919 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:56.538045 master-0 kubenswrapper[4790]: I1011 10:55:56.537969 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:56.550306 master-0 kubenswrapper[4790]: I1011 10:55:56.550240 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:56.988405 master-0 kubenswrapper[4790]: I1011 10:55:56.988345 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-api-1" Oct 11 10:55:56.997502 master-0 kubenswrapper[4790]: I1011 10:55:56.997411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"30c351cb-246a-4343-a56d-c74fb4be119e","Type":"ContainerDied","Data":"72c3b882472f96be141ddefa786998e8b0390cd596d77062abb0fcaa4a2d580f"} Oct 11 10:55:56.997502 master-0 kubenswrapper[4790]: I1011 10:55:56.997464 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:56.997502 master-0 kubenswrapper[4790]: I1011 10:55:56.997483 4790 scope.go:117] "RemoveContainer" containerID="f0083e208a574369bee7ac5b619444ddb040559a7e6adc8699675b4802ed4ee1" Oct 11 10:55:56.998303 master-0 kubenswrapper[4790]: I1011 10:55:56.998271 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:57.022828 master-0 kubenswrapper[4790]: I1011 10:55:57.022771 4790 scope.go:117] "RemoveContainer" containerID="60be7f28501f150423ff01b582dcc8cfc5bab82c1c195c3a36e327493870a191" Oct 11 10:55:57.036119 master-0 kubenswrapper[4790]: I1011 10:55:57.035979 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:57.066969 master-0 kubenswrapper[4790]: I1011 10:55:57.066885 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:57.086449 master-0 kubenswrapper[4790]: I1011 10:55:57.086333 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:57.087776 master-0 kubenswrapper[4790]: E1011 10:55:57.086804 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-api" Oct 11 10:55:57.087776 master-0 kubenswrapper[4790]: I1011 10:55:57.086829 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-api" Oct 11 10:55:57.087776 master-0 kubenswrapper[4790]: E1011 10:55:57.086853 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-b5802-api-log" Oct 11 10:55:57.087776 master-0 kubenswrapper[4790]: I1011 10:55:57.087128 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-b5802-api-log" Oct 11 10:55:57.090019 master-0 kubenswrapper[4790]: I1011 10:55:57.088169 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-api" Oct 11 10:55:57.090019 master-0 kubenswrapper[4790]: I1011 10:55:57.088226 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" containerName="cinder-b5802-api-log" Oct 11 10:55:57.090152 master-0 kubenswrapper[4790]: I1011 10:55:57.090119 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.093731 master-0 kubenswrapper[4790]: I1011 10:55:57.093677 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b5802-api-config-data" Oct 11 10:55:57.093979 master-0 kubenswrapper[4790]: I1011 10:55:57.093939 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 11 10:55:57.094661 master-0 kubenswrapper[4790]: I1011 10:55:57.094294 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 11 10:55:57.096681 master-0 kubenswrapper[4790]: I1011 10:55:57.096611 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:57.210289 master-0 kubenswrapper[4790]: I1011 10:55:57.207911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-internal-tls-certs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 kubenswrapper[4790]: I1011 10:55:57.208107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-scripts\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 kubenswrapper[4790]: I1011 10:55:57.208209 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b61e18b-6985-48e4-96d6-880b0c497e66-etc-machine-id\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 
kubenswrapper[4790]: I1011 10:55:57.208323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-public-tls-certs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 kubenswrapper[4790]: I1011 10:55:57.208353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vf2\" (UniqueName: \"kubernetes.io/projected/2b61e18b-6985-48e4-96d6-880b0c497e66-kube-api-access-z9vf2\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 kubenswrapper[4790]: I1011 10:55:57.208434 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-config-data-custom\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 kubenswrapper[4790]: I1011 10:55:57.208493 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-config-data\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 kubenswrapper[4790]: I1011 10:55:57.208525 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b61e18b-6985-48e4-96d6-880b0c497e66-logs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.210289 master-0 
kubenswrapper[4790]: I1011 10:55:57.208589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-combined-ca-bundle\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311231 master-0 kubenswrapper[4790]: I1011 10:55:57.311131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-internal-tls-certs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-scripts\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311304 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b61e18b-6985-48e4-96d6-880b0c497e66-etc-machine-id\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-public-tls-certs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311378 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z9vf2\" (UniqueName: \"kubernetes.io/projected/2b61e18b-6985-48e4-96d6-880b0c497e66-kube-api-access-z9vf2\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-config-data-custom\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311447 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-config-data\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b61e18b-6985-48e4-96d6-880b0c497e66-logs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.311553 master-0 kubenswrapper[4790]: I1011 10:55:57.311485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-combined-ca-bundle\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.314832 master-0 kubenswrapper[4790]: I1011 10:55:57.314504 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2b61e18b-6985-48e4-96d6-880b0c497e66-etc-machine-id\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.317057 master-0 kubenswrapper[4790]: I1011 10:55:57.316947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b61e18b-6985-48e4-96d6-880b0c497e66-logs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.317900 master-0 kubenswrapper[4790]: I1011 10:55:57.317853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-combined-ca-bundle\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.318106 master-0 kubenswrapper[4790]: I1011 10:55:57.318057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-internal-tls-certs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.318243 master-0 kubenswrapper[4790]: I1011 10:55:57.318184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-config-data\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.318950 master-0 kubenswrapper[4790]: I1011 10:55:57.318920 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-config-data-custom\") pod \"cinder-b5802-api-1\" (UID: 
\"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.319588 master-0 kubenswrapper[4790]: I1011 10:55:57.319547 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-scripts\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.323055 master-0 kubenswrapper[4790]: I1011 10:55:57.322186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b61e18b-6985-48e4-96d6-880b0c497e66-public-tls-certs\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.334446 master-0 kubenswrapper[4790]: I1011 10:55:57.334028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vf2\" (UniqueName: \"kubernetes.io/projected/2b61e18b-6985-48e4-96d6-880b0c497e66-kube-api-access-z9vf2\") pod \"cinder-b5802-api-1\" (UID: \"2b61e18b-6985-48e4-96d6-880b0c497e66\") " pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.465486 master-0 kubenswrapper[4790]: I1011 10:55:57.465407 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5802-api-1" Oct 11 10:55:57.935511 master-0 kubenswrapper[4790]: W1011 10:55:57.935444 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b61e18b_6985_48e4_96d6_880b0c497e66.slice/crio-d4f8b9fb6cd0677010930ff89f8578c0b2a6fc98f1343e529253bcbfc7307e63 WatchSource:0}: Error finding container d4f8b9fb6cd0677010930ff89f8578c0b2a6fc98f1343e529253bcbfc7307e63: Status 404 returned error can't find the container with id d4f8b9fb6cd0677010930ff89f8578c0b2a6fc98f1343e529253bcbfc7307e63 Oct 11 10:55:57.941969 master-0 kubenswrapper[4790]: I1011 10:55:57.941888 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5802-api-1"] Oct 11 10:55:58.011644 master-0 kubenswrapper[4790]: I1011 10:55:58.011578 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"2b61e18b-6985-48e4-96d6-880b0c497e66","Type":"ContainerStarted","Data":"d4f8b9fb6cd0677010930ff89f8578c0b2a6fc98f1343e529253bcbfc7307e63"} Oct 11 10:55:58.308513 master-0 kubenswrapper[4790]: I1011 10:55:58.308457 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c351cb-246a-4343-a56d-c74fb4be119e" path="/var/lib/kubelet/pods/30c351cb-246a-4343-a56d-c74fb4be119e/volumes" Oct 11 10:55:58.481095 master-0 kubenswrapper[4790]: I1011 10:55:58.481046 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:55:58.481410 master-0 kubenswrapper[4790]: I1011 10:55:58.481378 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b5802-default-external-api-1" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-log" containerID="cri-o://1980b844ffaf3f4ae914e07f575c0c012ae970754f7fcb5bc4e59940912228a6" gracePeriod=30 Oct 11 10:55:58.481579 master-0 kubenswrapper[4790]: I1011 10:55:58.481539 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b5802-default-external-api-1" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-httpd" containerID="cri-o://b351fbb5c1ba1d71d14296c2d406aca09cc6016d89d1300e0ba6b0d00727939f" gracePeriod=30 Oct 11 10:55:59.034068 master-0 kubenswrapper[4790]: I1011 10:55:59.034009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"2b61e18b-6985-48e4-96d6-880b0c497e66","Type":"ContainerStarted","Data":"60172d0913fc2a88f9aab061db4aca93de54ff4b7e2c4c948541249d981cee3f"} Oct 11 10:55:59.040991 master-0 kubenswrapper[4790]: I1011 10:55:59.040842 4790 generic.go:334] "Generic (PLEG): container finished" podID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerID="1980b844ffaf3f4ae914e07f575c0c012ae970754f7fcb5bc4e59940912228a6" exitCode=143 Oct 11 10:55:59.040991 master-0 kubenswrapper[4790]: I1011 10:55:59.040935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"3166d4fc-8488-46dc-9d63-87dc403f66bc","Type":"ContainerDied","Data":"1980b844ffaf3f4ae914e07f575c0c012ae970754f7fcb5bc4e59940912228a6"} Oct 11 10:55:59.041166 master-0 kubenswrapper[4790]: I1011 10:55:59.041001 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 10:55:59.041166 master-0 kubenswrapper[4790]: I1011 10:55:59.041013 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 10:55:59.236434 master-0 kubenswrapper[4790]: I1011 10:55:59.236377 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:55:59.236847 master-0 kubenswrapper[4790]: I1011 10:55:59.236809 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-internal-api-2" Oct 11 10:56:00.057982 master-0 kubenswrapper[4790]: I1011 
10:56:00.057881 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5802-api-1" event={"ID":"2b61e18b-6985-48e4-96d6-880b0c497e66","Type":"ContainerStarted","Data":"87fe0bfeba1f22ca991252d7932d9baf470ee6ea4292baa531184e04779f043e"} Oct 11 10:56:00.059001 master-0 kubenswrapper[4790]: I1011 10:56:00.058024 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-b5802-api-1" Oct 11 10:56:00.099841 master-0 kubenswrapper[4790]: I1011 10:56:00.099720 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b5802-api-1" podStartSLOduration=3.099681135 podStartE2EDuration="3.099681135s" podCreationTimestamp="2025-10-11 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:56:00.09404314 +0000 UTC m=+1036.648503442" watchObservedRunningTime="2025-10-11 10:56:00.099681135 +0000 UTC m=+1036.654141427" Oct 11 10:56:02.079990 master-0 kubenswrapper[4790]: I1011 10:56:02.079915 4790 generic.go:334] "Generic (PLEG): container finished" podID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerID="b351fbb5c1ba1d71d14296c2d406aca09cc6016d89d1300e0ba6b0d00727939f" exitCode=0 Oct 11 10:56:02.080591 master-0 kubenswrapper[4790]: I1011 10:56:02.080010 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"3166d4fc-8488-46dc-9d63-87dc403f66bc","Type":"ContainerDied","Data":"b351fbb5c1ba1d71d14296c2d406aca09cc6016d89d1300e0ba6b0d00727939f"} Oct 11 10:56:02.197513 master-0 kubenswrapper[4790]: I1011 10:56:02.197290 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:02.323102 master-0 kubenswrapper[4790]: I1011 10:56:02.323024 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-config-data\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.323550 master-0 kubenswrapper[4790]: I1011 10:56:02.323525 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpbtf\" (UniqueName: \"kubernetes.io/projected/3166d4fc-8488-46dc-9d63-87dc403f66bc-kube-api-access-vpbtf\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.323804 master-0 kubenswrapper[4790]: I1011 10:56:02.323788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-combined-ca-bundle\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.323938 master-0 kubenswrapper[4790]: I1011 10:56:02.323922 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-logs\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.324028 master-0 kubenswrapper[4790]: I1011 10:56:02.324016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-public-tls-certs\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.324151 master-0 kubenswrapper[4790]: I1011 10:56:02.324137 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-httpd-run\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.324336 master-0 kubenswrapper[4790]: I1011 10:56:02.324320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-scripts\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.324491 master-0 kubenswrapper[4790]: I1011 10:56:02.324433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-logs" (OuterVolumeSpecName: "logs") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:56:02.324647 master-0 kubenswrapper[4790]: I1011 10:56:02.324633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"3166d4fc-8488-46dc-9d63-87dc403f66bc\" (UID: \"3166d4fc-8488-46dc-9d63-87dc403f66bc\") " Oct 11 10:56:02.325074 master-0 kubenswrapper[4790]: I1011 10:56:02.325043 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:56:02.325375 master-0 kubenswrapper[4790]: I1011 10:56:02.325336 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-logs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:02.325455 master-0 kubenswrapper[4790]: I1011 10:56:02.325443 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3166d4fc-8488-46dc-9d63-87dc403f66bc-httpd-run\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:02.329813 master-0 kubenswrapper[4790]: I1011 10:56:02.329779 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3166d4fc-8488-46dc-9d63-87dc403f66bc-kube-api-access-vpbtf" (OuterVolumeSpecName: "kube-api-access-vpbtf") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "kube-api-access-vpbtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:56:02.335538 master-0 kubenswrapper[4790]: I1011 10:56:02.335435 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-scripts" (OuterVolumeSpecName: "scripts") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:02.355002 master-0 kubenswrapper[4790]: I1011 10:56:02.354915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7" (OuterVolumeSpecName: "glance") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "pvc-c7212717-18be-4287-9071-f6f818672815". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 11 10:56:02.360611 master-0 kubenswrapper[4790]: I1011 10:56:02.360422 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:02.369938 master-0 kubenswrapper[4790]: I1011 10:56:02.369860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-config-data" (OuterVolumeSpecName: "config-data") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:02.404738 master-0 kubenswrapper[4790]: I1011 10:56:02.404618 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3166d4fc-8488-46dc-9d63-87dc403f66bc" (UID: "3166d4fc-8488-46dc-9d63-87dc403f66bc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:02.428042 master-0 kubenswrapper[4790]: I1011 10:56:02.427988 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-scripts\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:02.428133 master-0 kubenswrapper[4790]: I1011 10:56:02.428075 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") on node \"master-0\" " Oct 11 10:56:02.428133 master-0 kubenswrapper[4790]: I1011 10:56:02.428098 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-config-data\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:02.428133 master-0 kubenswrapper[4790]: I1011 10:56:02.428113 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpbtf\" (UniqueName: \"kubernetes.io/projected/3166d4fc-8488-46dc-9d63-87dc403f66bc-kube-api-access-vpbtf\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:02.428133 master-0 kubenswrapper[4790]: I1011 10:56:02.428130 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:02.428311 master-0 kubenswrapper[4790]: I1011 10:56:02.428143 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3166d4fc-8488-46dc-9d63-87dc403f66bc-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:02.447102 master-0 kubenswrapper[4790]: I1011 10:56:02.447029 4790 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 11 10:56:02.447404 master-0 kubenswrapper[4790]: I1011 10:56:02.447284 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c7212717-18be-4287-9071-f6f818672815" (UniqueName: "kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7") on node "master-0" Oct 11 10:56:02.530757 master-0 kubenswrapper[4790]: I1011 10:56:02.530544 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:03.099980 master-0 kubenswrapper[4790]: I1011 10:56:03.099825 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"3166d4fc-8488-46dc-9d63-87dc403f66bc","Type":"ContainerDied","Data":"f242a619a098bee9251349acb03ad40745b6b14dcdda08d9b62f04ce2b3b042e"} Oct 11 10:56:03.099980 master-0 kubenswrapper[4790]: I1011 10:56:03.099945 4790 scope.go:117] "RemoveContainer" containerID="b351fbb5c1ba1d71d14296c2d406aca09cc6016d89d1300e0ba6b0d00727939f" Oct 11 10:56:03.099980 master-0 kubenswrapper[4790]: I1011 10:56:03.099965 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.131062 master-0 kubenswrapper[4790]: I1011 10:56:03.131002 4790 scope.go:117] "RemoveContainer" containerID="1980b844ffaf3f4ae914e07f575c0c012ae970754f7fcb5bc4e59940912228a6" Oct 11 10:56:03.152462 master-0 kubenswrapper[4790]: I1011 10:56:03.151764 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:56:03.158579 master-0 kubenswrapper[4790]: I1011 10:56:03.158515 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:56:03.183851 master-0 kubenswrapper[4790]: I1011 10:56:03.183792 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:56:03.184538 master-0 kubenswrapper[4790]: E1011 10:56:03.184521 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-log" Oct 11 10:56:03.184616 master-0 kubenswrapper[4790]: I1011 10:56:03.184606 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-log" Oct 11 10:56:03.184680 master-0 kubenswrapper[4790]: E1011 10:56:03.184670 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-httpd" Oct 11 10:56:03.184756 master-0 kubenswrapper[4790]: I1011 10:56:03.184744 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-httpd" Oct 11 10:56:03.185015 master-0 kubenswrapper[4790]: I1011 10:56:03.184998 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-log" Oct 11 10:56:03.185101 master-0 kubenswrapper[4790]: I1011 10:56:03.185091 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" containerName="glance-httpd" Oct 11 10:56:03.186184 master-0 kubenswrapper[4790]: I1011 10:56:03.186167 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.189487 master-0 kubenswrapper[4790]: I1011 10:56:03.189362 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b5802-default-external-config-data" Oct 11 10:56:03.190620 master-0 kubenswrapper[4790]: I1011 10:56:03.190220 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 11 10:56:03.239410 master-0 kubenswrapper[4790]: I1011 10:56:03.239246 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-external-api-1"] Oct 11 10:56:03.254242 master-0 kubenswrapper[4790]: I1011 10:56:03.254127 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.254242 master-0 kubenswrapper[4790]: I1011 10:56:03.254203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.254566 master-0 kubenswrapper[4790]: I1011 10:56:03.254257 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-config-data\") pod 
\"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.254863 master-0 kubenswrapper[4790]: I1011 10:56:03.254306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkccz\" (UniqueName: \"kubernetes.io/projected/76b7c4b6-c727-4201-9627-23a06e9ae7ea-kube-api-access-fkccz\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.254997 master-0 kubenswrapper[4790]: I1011 10:56:03.254946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76b7c4b6-c727-4201-9627-23a06e9ae7ea-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.254997 master-0 kubenswrapper[4790]: I1011 10:56:03.254991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76b7c4b6-c727-4201-9627-23a06e9ae7ea-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.255098 master-0 kubenswrapper[4790]: I1011 10:56:03.255037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.255185 master-0 kubenswrapper[4790]: I1011 10:56:03.255163 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.356860 master-0 kubenswrapper[4790]: I1011 10:56:03.356622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76b7c4b6-c727-4201-9627-23a06e9ae7ea-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.356860 master-0 kubenswrapper[4790]: I1011 10:56:03.356693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.356860 master-0 kubenswrapper[4790]: I1011 10:56:03.356760 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.356860 master-0 kubenswrapper[4790]: I1011 10:56:03.356804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " 
pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.356860 master-0 kubenswrapper[4790]: I1011 10:56:03.356827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.356860 master-0 kubenswrapper[4790]: I1011 10:56:03.356855 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.356860 master-0 kubenswrapper[4790]: I1011 10:56:03.356883 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkccz\" (UniqueName: \"kubernetes.io/projected/76b7c4b6-c727-4201-9627-23a06e9ae7ea-kube-api-access-fkccz\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.357382 master-0 kubenswrapper[4790]: I1011 10:56:03.356926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76b7c4b6-c727-4201-9627-23a06e9ae7ea-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.357382 master-0 kubenswrapper[4790]: I1011 10:56:03.357253 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76b7c4b6-c727-4201-9627-23a06e9ae7ea-httpd-run\") pod \"glance-b5802-default-external-api-1\" (UID: 
\"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.357624 master-0 kubenswrapper[4790]: I1011 10:56:03.357593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76b7c4b6-c727-4201-9627-23a06e9ae7ea-logs\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.359894 master-0 kubenswrapper[4790]: I1011 10:56:03.359861 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 11 10:56:03.359978 master-0 kubenswrapper[4790]: I1011 10:56:03.359896 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/95ab3ea1c73b905e55aa0f0a1e574a5056ec96dde23978388ab58fbe89465472/globalmount\"" pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.361290 master-0 kubenswrapper[4790]: I1011 10:56:03.361247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-public-tls-certs\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.362278 master-0 kubenswrapper[4790]: I1011 10:56:03.362236 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-scripts\") pod \"glance-b5802-default-external-api-1\" (UID: 
\"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.362665 master-0 kubenswrapper[4790]: I1011 10:56:03.362600 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-combined-ca-bundle\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.369186 master-0 kubenswrapper[4790]: I1011 10:56:03.369137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b7c4b6-c727-4201-9627-23a06e9ae7ea-config-data\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:03.377335 master-0 kubenswrapper[4790]: I1011 10:56:03.377281 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkccz\" (UniqueName: \"kubernetes.io/projected/76b7c4b6-c727-4201-9627-23a06e9ae7ea-kube-api-access-fkccz\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " pod="openstack/glance-b5802-default-external-api-1" Oct 11 10:56:04.306273 master-0 kubenswrapper[4790]: I1011 10:56:04.306205 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3166d4fc-8488-46dc-9d63-87dc403f66bc" path="/var/lib/kubelet/pods/3166d4fc-8488-46dc-9d63-87dc403f66bc/volumes" Oct 11 10:56:04.834420 master-0 kubenswrapper[4790]: I1011 10:56:04.834359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7212717-18be-4287-9071-f6f818672815\" (UniqueName: \"kubernetes.io/csi/topolvm.io^adbf5277-58ef-49f6-8f5f-4a3a5350d8b7\") pod \"glance-b5802-default-external-api-1\" (UID: \"76b7c4b6-c727-4201-9627-23a06e9ae7ea\") " 
pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:05.041490 master-0 kubenswrapper[4790]: I1011 10:56:05.039998 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:05.658322 master-0 kubenswrapper[4790]: I1011 10:56:05.658271 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5802-default-external-api-1"]
Oct 11 10:56:06.129598 master-0 kubenswrapper[4790]: I1011 10:56:06.129543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"76b7c4b6-c727-4201-9627-23a06e9ae7ea","Type":"ContainerStarted","Data":"c85f73b43bcb60791e1609387e6f01887cda94a93c1b9c636cf715e1a1e6d520"}
Oct 11 10:56:07.145348 master-0 kubenswrapper[4790]: I1011 10:56:07.145269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"76b7c4b6-c727-4201-9627-23a06e9ae7ea","Type":"ContainerStarted","Data":"7cfad841733d6d01c552a9fe1d6ac9225e5aea7beee8d5703b4c33e0f1b8d4f1"}
Oct 11 10:56:07.145348 master-0 kubenswrapper[4790]: I1011 10:56:07.145332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5802-default-external-api-1" event={"ID":"76b7c4b6-c727-4201-9627-23a06e9ae7ea","Type":"ContainerStarted","Data":"1f3509d59c7f9f63906f892a9d00e75ff0e459cdb1a0f40da469a10a53a0d0c7"}
Oct 11 10:56:07.192835 master-0 kubenswrapper[4790]: I1011 10:56:07.192652 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b5802-default-external-api-1" podStartSLOduration=4.192624517 podStartE2EDuration="4.192624517s" podCreationTimestamp="2025-10-11 10:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:56:07.184099743 +0000 UTC m=+1043.738560075" watchObservedRunningTime="2025-10-11 10:56:07.192624517 +0000 UTC m=+1043.747084819"
Oct 11 10:56:09.382805 master-0 kubenswrapper[4790]: I1011 10:56:09.382733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-b5802-api-1"
Oct 11 10:56:11.910035 master-0 kubenswrapper[4790]: I1011 10:56:11.909966 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-748bbfcf89-9smpw"]
Oct 11 10:56:11.912832 master-0 kubenswrapper[4790]: I1011 10:56:11.912752 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:11.916138 master-0 kubenswrapper[4790]: I1011 10:56:11.916111 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Oct 11 10:56:11.916342 master-0 kubenswrapper[4790]: I1011 10:56:11.916165 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Oct 11 10:56:11.932587 master-0 kubenswrapper[4790]: I1011 10:56:11.932531 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748bbfcf89-9smpw"]
Oct 11 10:56:11.946232 master-0 kubenswrapper[4790]: I1011 10:56:11.946159 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-internal-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:11.946232 master-0 kubenswrapper[4790]: I1011 10:56:11.946218 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-combined-ca-bundle\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:11.946620 master-0 kubenswrapper[4790]: I1011 10:56:11.946290 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-public-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:11.946620 master-0 kubenswrapper[4790]: I1011 10:56:11.946517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-ovndb-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:11.946897 master-0 kubenswrapper[4790]: I1011 10:56:11.946842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdpj\" (UniqueName: \"kubernetes.io/projected/bd214893-adb3-4a7f-b947-814410ab6375-kube-api-access-qjdpj\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:11.946979 master-0 kubenswrapper[4790]: I1011 10:56:11.946907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-config\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:11.946979 master-0 kubenswrapper[4790]: I1011 10:56:11.946957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-httpd-config\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.048814 master-0 kubenswrapper[4790]: I1011 10:56:12.048740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-internal-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.048814 master-0 kubenswrapper[4790]: I1011 10:56:12.048803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-combined-ca-bundle\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.049205 master-0 kubenswrapper[4790]: I1011 10:56:12.048857 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-public-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.049205 master-0 kubenswrapper[4790]: I1011 10:56:12.048889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-ovndb-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.049205 master-0 kubenswrapper[4790]: I1011 10:56:12.048923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdpj\" (UniqueName: \"kubernetes.io/projected/bd214893-adb3-4a7f-b947-814410ab6375-kube-api-access-qjdpj\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.049205 master-0 kubenswrapper[4790]: I1011 10:56:12.048948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-config\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.049205 master-0 kubenswrapper[4790]: I1011 10:56:12.048976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-httpd-config\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.053912 master-0 kubenswrapper[4790]: I1011 10:56:12.053853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-config\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.054467 master-0 kubenswrapper[4790]: I1011 10:56:12.054433 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-ovndb-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.055149 master-0 kubenswrapper[4790]: I1011 10:56:12.055086 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-httpd-config\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.055209 master-0 kubenswrapper[4790]: I1011 10:56:12.055151 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-public-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.056383 master-0 kubenswrapper[4790]: I1011 10:56:12.056347 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-combined-ca-bundle\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.056729 master-0 kubenswrapper[4790]: I1011 10:56:12.056660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd214893-adb3-4a7f-b947-814410ab6375-internal-tls-certs\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.094598 master-0 kubenswrapper[4790]: I1011 10:56:12.094524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdpj\" (UniqueName: \"kubernetes.io/projected/bd214893-adb3-4a7f-b947-814410ab6375-kube-api-access-qjdpj\") pod \"neutron-748bbfcf89-9smpw\" (UID: \"bd214893-adb3-4a7f-b947-814410ab6375\") " pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.255232 master-0 kubenswrapper[4790]: I1011 10:56:12.255102 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:12.860850 master-0 kubenswrapper[4790]: I1011 10:56:12.860761 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748bbfcf89-9smpw"]
Oct 11 10:56:12.870075 master-0 kubenswrapper[4790]: W1011 10:56:12.870013 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd214893_adb3_4a7f_b947_814410ab6375.slice/crio-1d40fed2c34b787c65133a4e33b9d39ed4fa1f38e899e85f25a46d38e7e9ca57 WatchSource:0}: Error finding container 1d40fed2c34b787c65133a4e33b9d39ed4fa1f38e899e85f25a46d38e7e9ca57: Status 404 returned error can't find the container with id 1d40fed2c34b787c65133a4e33b9d39ed4fa1f38e899e85f25a46d38e7e9ca57
Oct 11 10:56:13.219290 master-0 kubenswrapper[4790]: I1011 10:56:13.219211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748bbfcf89-9smpw" event={"ID":"bd214893-adb3-4a7f-b947-814410ab6375","Type":"ContainerStarted","Data":"a84e4701f8238a0a3ba3476b64e6b406711d9beb43f39f2ab6887bac602983d2"}
Oct 11 10:56:13.219290 master-0 kubenswrapper[4790]: I1011 10:56:13.219279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748bbfcf89-9smpw" event={"ID":"bd214893-adb3-4a7f-b947-814410ab6375","Type":"ContainerStarted","Data":"1d40fed2c34b787c65133a4e33b9d39ed4fa1f38e899e85f25a46d38e7e9ca57"}
Oct 11 10:56:14.234763 master-0 kubenswrapper[4790]: I1011 10:56:14.234577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748bbfcf89-9smpw" event={"ID":"bd214893-adb3-4a7f-b947-814410ab6375","Type":"ContainerStarted","Data":"d6c7ac18f7845ac5f376c39bebecfc59193f64791fdb3a9cdf0592dde370a55c"}
Oct 11 10:56:14.235956 master-0 kubenswrapper[4790]: I1011 10:56:14.234962 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:15.040927 master-0 kubenswrapper[4790]: I1011 10:56:15.040841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:15.040927 master-0 kubenswrapper[4790]: I1011 10:56:15.040925 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:15.081325 master-0 kubenswrapper[4790]: I1011 10:56:15.081033 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:15.096640 master-0 kubenswrapper[4790]: I1011 10:56:15.096578 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:15.116009 master-0 kubenswrapper[4790]: I1011 10:56:15.115916 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-748bbfcf89-9smpw" podStartSLOduration=4.115894484 podStartE2EDuration="4.115894484s" podCreationTimestamp="2025-10-11 10:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:56:14.498664494 +0000 UTC m=+1051.053124786" watchObservedRunningTime="2025-10-11 10:56:15.115894484 +0000 UTC m=+1051.670354766"
Oct 11 10:56:15.252700 master-0 kubenswrapper[4790]: I1011 10:56:15.252616 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:15.252700 master-0 kubenswrapper[4790]: I1011 10:56:15.252665 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:17.277949 master-0 kubenswrapper[4790]: I1011 10:56:17.274679 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:17.277949 master-0 kubenswrapper[4790]: I1011 10:56:17.275085 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b5802-default-external-api-1"
Oct 11 10:56:27.237095 master-0 kubenswrapper[4790]: I1011 10:56:27.235445 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-2"]
Oct 11 10:56:27.249320 master-0 kubenswrapper[4790]: I1011 10:56:27.249246 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.252643 master-0 kubenswrapper[4790]: I1011 10:56:27.252593 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 11 10:56:27.252887 master-0 kubenswrapper[4790]: I1011 10:56:27.252851 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:56:27.255172 master-0 kubenswrapper[4790]: I1011 10:56:27.255133 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1"
Oct 11 10:56:27.258783 master-0 kubenswrapper[4790]: I1011 10:56:27.258265 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 11 10:56:27.271815 master-0 kubenswrapper[4790]: I1011 10:56:27.267387 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:56:27.277107 master-0 kubenswrapper[4790]: I1011 10:56:27.276672 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-config-data\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.277107 master-0 kubenswrapper[4790]: I1011 10:56:27.276881 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6w4\" (UniqueName: \"kubernetes.io/projected/0edb0512-334f-4bfd-b297-cce29a7c510b-kube-api-access-9l6w4\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.277107 master-0 kubenswrapper[4790]: I1011 10:56:27.276985 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253852fc-de03-49f0-8e18-b3ccba3d4966-logs\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.277274 master-0 kubenswrapper[4790]: I1011 10:56:27.277073 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhcj7\" (UniqueName: \"kubernetes.io/projected/253852fc-de03-49f0-8e18-b3ccba3d4966-kube-api-access-jhcj7\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.277274 master-0 kubenswrapper[4790]: I1011 10:56:27.277157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-config-data\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.277274 master-0 kubenswrapper[4790]: I1011 10:56:27.277187 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.277361 master-0 kubenswrapper[4790]: I1011 10:56:27.277331 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.279175 master-0 kubenswrapper[4790]: I1011 10:56:27.279116 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-2"]
Oct 11 10:56:27.379491 master-0 kubenswrapper[4790]: I1011 10:56:27.379410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-config-data\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.379788 master-0 kubenswrapper[4790]: I1011 10:56:27.379533 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6w4\" (UniqueName: \"kubernetes.io/projected/0edb0512-334f-4bfd-b297-cce29a7c510b-kube-api-access-9l6w4\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.379788 master-0 kubenswrapper[4790]: I1011 10:56:27.379577 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253852fc-de03-49f0-8e18-b3ccba3d4966-logs\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.379788 master-0 kubenswrapper[4790]: I1011 10:56:27.379620 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhcj7\" (UniqueName: \"kubernetes.io/projected/253852fc-de03-49f0-8e18-b3ccba3d4966-kube-api-access-jhcj7\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.379788 master-0 kubenswrapper[4790]: I1011 10:56:27.379643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-config-data\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.379788 master-0 kubenswrapper[4790]: I1011 10:56:27.379665 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.380069 master-0 kubenswrapper[4790]: I1011 10:56:27.379801 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.380433 master-0 kubenswrapper[4790]: I1011 10:56:27.380384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253852fc-de03-49f0-8e18-b3ccba3d4966-logs\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.384078 master-0 kubenswrapper[4790]: I1011 10:56:27.384013 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.384664 master-0 kubenswrapper[4790]: I1011 10:56:27.384405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-config-data\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.388624 master-0 kubenswrapper[4790]: I1011 10:56:27.388565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.392272 master-0 kubenswrapper[4790]: I1011 10:56:27.392211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-config-data\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.405426 master-0 kubenswrapper[4790]: I1011 10:56:27.405354 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhcj7\" (UniqueName: \"kubernetes.io/projected/253852fc-de03-49f0-8e18-b3ccba3d4966-kube-api-access-jhcj7\") pod \"nova-api-1\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") " pod="openstack/nova-api-1"
Oct 11 10:56:27.408821 master-0 kubenswrapper[4790]: I1011 10:56:27.408760 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6w4\" (UniqueName: \"kubernetes.io/projected/0edb0512-334f-4bfd-b297-cce29a7c510b-kube-api-access-9l6w4\") pod \"nova-scheduler-2\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") " pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.589975 master-0 kubenswrapper[4790]: I1011 10:56:27.589176 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2"
Oct 11 10:56:27.603782 master-0 kubenswrapper[4790]: I1011 10:56:27.603703 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1"
Oct 11 10:56:27.631764 master-0 kubenswrapper[4790]: I1011 10:56:27.631688 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:27.633400 master-0 kubenswrapper[4790]: I1011 10:56:27.633375 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:56:27.638830 master-0 kubenswrapper[4790]: I1011 10:56:27.636783 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 11 10:56:27.653154 master-0 kubenswrapper[4790]: I1011 10:56:27.653084 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:27.687629 master-0 kubenswrapper[4790]: I1011 10:56:27.687193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.687629 master-0 kubenswrapper[4790]: I1011 10:56:27.687342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-config-data\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.687629 master-0 kubenswrapper[4790]: I1011 10:56:27.687397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1479006f-cac6-481e-86de-1ec1bed55c2d-logs\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.687629 master-0 kubenswrapper[4790]: I1011 10:56:27.687443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cktcw\" (UniqueName: \"kubernetes.io/projected/1479006f-cac6-481e-86de-1ec1bed55c2d-kube-api-access-cktcw\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.790629 master-0 kubenswrapper[4790]: I1011 10:56:27.790256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-config-data\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.790629 master-0 kubenswrapper[4790]: I1011 10:56:27.790343 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1479006f-cac6-481e-86de-1ec1bed55c2d-logs\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.790629 master-0 kubenswrapper[4790]: I1011 10:56:27.790388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cktcw\" (UniqueName: \"kubernetes.io/projected/1479006f-cac6-481e-86de-1ec1bed55c2d-kube-api-access-cktcw\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.790629 master-0 kubenswrapper[4790]: I1011 10:56:27.790428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.791397 master-0 kubenswrapper[4790]: I1011 10:56:27.791305 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1479006f-cac6-481e-86de-1ec1bed55c2d-logs\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.799678 master-0 kubenswrapper[4790]: I1011 10:56:27.798727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-config-data\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.801242 master-0 kubenswrapper[4790]: I1011 10:56:27.800550 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:27.831897 master-0 kubenswrapper[4790]: I1011 10:56:27.830435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cktcw\" (UniqueName: \"kubernetes.io/projected/1479006f-cac6-481e-86de-1ec1bed55c2d-kube-api-access-cktcw\") pod \"nova-metadata-2\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:28.041936 master-0 kubenswrapper[4790]: I1011 10:56:28.041841 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:56:28.191249 master-0 kubenswrapper[4790]: I1011 10:56:28.191180 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:56:28.191952 master-0 kubenswrapper[4790]: W1011 10:56:28.191909 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod253852fc_de03_49f0_8e18_b3ccba3d4966.slice/crio-836f61a45327b518c64768eb8146728d82afb4f76df022ecfc8b1a40c35ad99e WatchSource:0}: Error finding container 836f61a45327b518c64768eb8146728d82afb4f76df022ecfc8b1a40c35ad99e: Status 404 returned error can't find the container with id 836f61a45327b518c64768eb8146728d82afb4f76df022ecfc8b1a40c35ad99e
Oct 11 10:56:28.264239 master-0 kubenswrapper[4790]: I1011 10:56:28.264179 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-2"]
Oct 11 10:56:28.266190 master-0 kubenswrapper[4790]: W1011 10:56:28.266002 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edb0512_334f_4bfd_b297_cce29a7c510b.slice/crio-ce923dabd975e7c88686332f8fea7fed5cd13b9284d3e5e9dcd78a4cee6b26bf WatchSource:0}: Error finding container ce923dabd975e7c88686332f8fea7fed5cd13b9284d3e5e9dcd78a4cee6b26bf: Status 404 returned error can't find the container with id ce923dabd975e7c88686332f8fea7fed5cd13b9284d3e5e9dcd78a4cee6b26bf
Oct 11 10:56:28.387813 master-0 kubenswrapper[4790]: I1011 10:56:28.387749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"0edb0512-334f-4bfd-b297-cce29a7c510b","Type":"ContainerStarted","Data":"ce923dabd975e7c88686332f8fea7fed5cd13b9284d3e5e9dcd78a4cee6b26bf"}
Oct 11 10:56:28.392676 master-0 kubenswrapper[4790]: I1011 10:56:28.392632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"253852fc-de03-49f0-8e18-b3ccba3d4966","Type":"ContainerStarted","Data":"836f61a45327b518c64768eb8146728d82afb4f76df022ecfc8b1a40c35ad99e"}
Oct 11 10:56:28.517783 master-0 kubenswrapper[4790]: W1011 10:56:28.517151 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1479006f_cac6_481e_86de_1ec1bed55c2d.slice/crio-c88496f1c62cf4144371d31cb98a180ae9bb8ab68b19a997dabce86192b784b4 WatchSource:0}: Error finding container c88496f1c62cf4144371d31cb98a180ae9bb8ab68b19a997dabce86192b784b4: Status 404 returned error can't find the container with id c88496f1c62cf4144371d31cb98a180ae9bb8ab68b19a997dabce86192b784b4
Oct 11 10:56:28.517783 master-0 kubenswrapper[4790]: I1011 10:56:28.517308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:29.421427 master-0 kubenswrapper[4790]: I1011 10:56:29.421296 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"1479006f-cac6-481e-86de-1ec1bed55c2d","Type":"ContainerStarted","Data":"c88496f1c62cf4144371d31cb98a180ae9bb8ab68b19a997dabce86192b784b4"}
Oct 11 10:56:31.486079 master-0 kubenswrapper[4790]: I1011 10:56:31.485997 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:37.547635 master-0 kubenswrapper[4790]: I1011 10:56:37.547542 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"0edb0512-334f-4bfd-b297-cce29a7c510b","Type":"ContainerStarted","Data":"bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866"}
Oct 11 10:56:37.551827 master-0 kubenswrapper[4790]: I1011 10:56:37.551788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"253852fc-de03-49f0-8e18-b3ccba3d4966","Type":"ContainerStarted","Data":"98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4"}
Oct 11 10:56:37.561119 master-0 kubenswrapper[4790]: I1011 10:56:37.557189 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"1479006f-cac6-481e-86de-1ec1bed55c2d","Type":"ContainerStarted","Data":"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b"}
Oct 11 10:56:37.583694 master-0 kubenswrapper[4790]: I1011 10:56:37.582250 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-2" podStartSLOduration=1.631917761 podStartE2EDuration="10.582228737s" podCreationTimestamp="2025-10-11 10:56:27 +0000 UTC" firstStartedPulling="2025-10-11 10:56:28.269293578 +0000 UTC m=+1064.823753870" lastFinishedPulling="2025-10-11 10:56:37.219604554 +0000 UTC m=+1073.774064846" observedRunningTime="2025-10-11 10:56:37.579841361 +0000 UTC m=+1074.134301663" watchObservedRunningTime="2025-10-11 10:56:37.582228737 +0000 UTC m=+1074.136689029"
Oct 11 10:56:37.589867 master-0 kubenswrapper[4790]: I1011 10:56:37.589506 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-2"
Oct 11 10:56:37.589867 master-0 kubenswrapper[4790]: I1011 10:56:37.589816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-2"
Oct 11 10:56:37.621366 master-0 kubenswrapper[4790]: I1011 10:56:37.621127 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-2"
Oct 11 10:56:38.241421 master-0 kubenswrapper[4790]: I1011 10:56:38.241163 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768f954cfc-9xg22"]
Oct 11 10:56:38.241718 master-0 kubenswrapper[4790]: I1011 10:56:38.241505 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" podUID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" containerName="dnsmasq-dns" containerID="cri-o://585d3075d62338a2a9a42f35ce1fe0086b7e13eaad601eec48eba4bae373241a" gracePeriod=10
Oct 11 10:56:38.566279 master-0 kubenswrapper[4790]: I1011 10:56:38.566186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"253852fc-de03-49f0-8e18-b3ccba3d4966","Type":"ContainerStarted","Data":"b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83"}
Oct 11 10:56:38.572090 master-0 kubenswrapper[4790]: I1011 10:56:38.572042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"1479006f-cac6-481e-86de-1ec1bed55c2d","Type":"ContainerStarted","Data":"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168"}
Oct 11 10:56:38.572434 master-0 kubenswrapper[4790]: I1011 10:56:38.572188 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-2" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-log" containerID="cri-o://963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b" gracePeriod=30
Oct 11 10:56:38.572434 master-0 kubenswrapper[4790]: I1011 10:56:38.572427 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-2" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-metadata" containerID="cri-o://edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168" gracePeriod=30
Oct 11 10:56:38.576967 master-0 kubenswrapper[4790]: I1011 10:56:38.576644 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" containerID="585d3075d62338a2a9a42f35ce1fe0086b7e13eaad601eec48eba4bae373241a" exitCode=0
Oct 11 10:56:38.576967 master-0 kubenswrapper[4790]: I1011 10:56:38.576741 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" event={"ID":"6ff24705-c685-47d9-ad1b-9ec04c541bf7","Type":"ContainerDied","Data":"585d3075d62338a2a9a42f35ce1fe0086b7e13eaad601eec48eba4bae373241a"}
Oct 11 10:56:38.629662 master-0
kubenswrapper[4790]: I1011 10:56:38.629526 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-2" podStartSLOduration=2.920112596 podStartE2EDuration="11.629501693s" podCreationTimestamp="2025-10-11 10:56:27 +0000 UTC" firstStartedPulling="2025-10-11 10:56:28.520188941 +0000 UTC m=+1065.074649233" lastFinishedPulling="2025-10-11 10:56:37.229578038 +0000 UTC m=+1073.784038330" observedRunningTime="2025-10-11 10:56:38.621564085 +0000 UTC m=+1075.176024377" watchObservedRunningTime="2025-10-11 10:56:38.629501693 +0000 UTC m=+1075.183961985" Oct 11 10:56:38.632166 master-0 kubenswrapper[4790]: I1011 10:56:38.632132 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1" podStartSLOduration=2.630115069 podStartE2EDuration="11.632127065s" podCreationTimestamp="2025-10-11 10:56:27 +0000 UTC" firstStartedPulling="2025-10-11 10:56:28.199329776 +0000 UTC m=+1064.753790068" lastFinishedPulling="2025-10-11 10:56:37.201341772 +0000 UTC m=+1073.755802064" observedRunningTime="2025-10-11 10:56:38.595615032 +0000 UTC m=+1075.150075344" watchObservedRunningTime="2025-10-11 10:56:38.632127065 +0000 UTC m=+1075.186587357" Oct 11 10:56:38.641016 master-0 kubenswrapper[4790]: I1011 10:56:38.640945 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-2" Oct 11 10:56:38.887512 master-0 kubenswrapper[4790]: I1011 10:56:38.887432 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:56:39.000859 master-0 kubenswrapper[4790]: I1011 10:56:39.000760 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-sb\") pod \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " Oct 11 10:56:39.002941 master-0 kubenswrapper[4790]: I1011 10:56:39.002878 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb9f4\" (UniqueName: \"kubernetes.io/projected/6ff24705-c685-47d9-ad1b-9ec04c541bf7-kube-api-access-hb9f4\") pod \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " Oct 11 10:56:39.003024 master-0 kubenswrapper[4790]: I1011 10:56:39.002982 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-svc\") pod \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " Oct 11 10:56:39.003201 master-0 kubenswrapper[4790]: I1011 10:56:39.003174 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-swift-storage-0\") pod \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " Oct 11 10:56:39.003260 master-0 kubenswrapper[4790]: I1011 10:56:39.003244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-config\") pod \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " Oct 11 10:56:39.003739 master-0 kubenswrapper[4790]: I1011 10:56:39.003680 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-nb\") pod \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\" (UID: \"6ff24705-c685-47d9-ad1b-9ec04c541bf7\") " Oct 11 10:56:39.015603 master-0 kubenswrapper[4790]: I1011 10:56:39.015301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff24705-c685-47d9-ad1b-9ec04c541bf7-kube-api-access-hb9f4" (OuterVolumeSpecName: "kube-api-access-hb9f4") pod "6ff24705-c685-47d9-ad1b-9ec04c541bf7" (UID: "6ff24705-c685-47d9-ad1b-9ec04c541bf7"). InnerVolumeSpecName "kube-api-access-hb9f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:56:39.064420 master-0 kubenswrapper[4790]: I1011 10:56:39.064337 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ff24705-c685-47d9-ad1b-9ec04c541bf7" (UID: "6ff24705-c685-47d9-ad1b-9ec04c541bf7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:56:39.065384 master-0 kubenswrapper[4790]: I1011 10:56:39.065315 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ff24705-c685-47d9-ad1b-9ec04c541bf7" (UID: "6ff24705-c685-47d9-ad1b-9ec04c541bf7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:56:39.065533 master-0 kubenswrapper[4790]: I1011 10:56:39.065457 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ff24705-c685-47d9-ad1b-9ec04c541bf7" (UID: "6ff24705-c685-47d9-ad1b-9ec04c541bf7"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:56:39.077045 master-0 kubenswrapper[4790]: I1011 10:56:39.076896 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-config" (OuterVolumeSpecName: "config") pod "6ff24705-c685-47d9-ad1b-9ec04c541bf7" (UID: "6ff24705-c685-47d9-ad1b-9ec04c541bf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:56:39.092886 master-0 kubenswrapper[4790]: I1011 10:56:39.092800 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ff24705-c685-47d9-ad1b-9ec04c541bf7" (UID: "6ff24705-c685-47d9-ad1b-9ec04c541bf7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 10:56:39.108058 master-0 kubenswrapper[4790]: I1011 10:56:39.107972 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.108058 master-0 kubenswrapper[4790]: I1011 10:56:39.108045 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.108058 master-0 kubenswrapper[4790]: I1011 10:56:39.108061 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.108058 master-0 kubenswrapper[4790]: I1011 10:56:39.108071 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.108354 master-0 kubenswrapper[4790]: I1011 10:56:39.108081 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb9f4\" (UniqueName: \"kubernetes.io/projected/6ff24705-c685-47d9-ad1b-9ec04c541bf7-kube-api-access-hb9f4\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.108354 master-0 kubenswrapper[4790]: I1011 10:56:39.108093 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ff24705-c685-47d9-ad1b-9ec04c541bf7-dns-svc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.359835 master-0 kubenswrapper[4790]: I1011 10:56:39.358508 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2" Oct 11 10:56:39.514569 master-0 kubenswrapper[4790]: I1011 10:56:39.514500 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1479006f-cac6-481e-86de-1ec1bed55c2d-logs\") pod \"1479006f-cac6-481e-86de-1ec1bed55c2d\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " Oct 11 10:56:39.514807 master-0 kubenswrapper[4790]: I1011 10:56:39.514749 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cktcw\" (UniqueName: \"kubernetes.io/projected/1479006f-cac6-481e-86de-1ec1bed55c2d-kube-api-access-cktcw\") pod \"1479006f-cac6-481e-86de-1ec1bed55c2d\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " Oct 11 10:56:39.514807 master-0 kubenswrapper[4790]: I1011 10:56:39.514787 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-config-data\") pod \"1479006f-cac6-481e-86de-1ec1bed55c2d\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " Oct 11 10:56:39.514868 
master-0 kubenswrapper[4790]: I1011 10:56:39.514839 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-combined-ca-bundle\") pod \"1479006f-cac6-481e-86de-1ec1bed55c2d\" (UID: \"1479006f-cac6-481e-86de-1ec1bed55c2d\") " Oct 11 10:56:39.515170 master-0 kubenswrapper[4790]: I1011 10:56:39.515102 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1479006f-cac6-481e-86de-1ec1bed55c2d-logs" (OuterVolumeSpecName: "logs") pod "1479006f-cac6-481e-86de-1ec1bed55c2d" (UID: "1479006f-cac6-481e-86de-1ec1bed55c2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:56:39.518017 master-0 kubenswrapper[4790]: I1011 10:56:39.517946 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1479006f-cac6-481e-86de-1ec1bed55c2d-kube-api-access-cktcw" (OuterVolumeSpecName: "kube-api-access-cktcw") pod "1479006f-cac6-481e-86de-1ec1bed55c2d" (UID: "1479006f-cac6-481e-86de-1ec1bed55c2d"). InnerVolumeSpecName "kube-api-access-cktcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:56:39.538284 master-0 kubenswrapper[4790]: I1011 10:56:39.538196 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-config-data" (OuterVolumeSpecName: "config-data") pod "1479006f-cac6-481e-86de-1ec1bed55c2d" (UID: "1479006f-cac6-481e-86de-1ec1bed55c2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:39.554837 master-0 kubenswrapper[4790]: I1011 10:56:39.554755 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1479006f-cac6-481e-86de-1ec1bed55c2d" (UID: "1479006f-cac6-481e-86de-1ec1bed55c2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:39.588548 master-0 kubenswrapper[4790]: I1011 10:56:39.588450 4790 generic.go:334] "Generic (PLEG): container finished" podID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerID="edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168" exitCode=0 Oct 11 10:56:39.588548 master-0 kubenswrapper[4790]: I1011 10:56:39.588521 4790 generic.go:334] "Generic (PLEG): container finished" podID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerID="963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b" exitCode=143 Oct 11 10:56:39.589188 master-0 kubenswrapper[4790]: I1011 10:56:39.588554 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"1479006f-cac6-481e-86de-1ec1bed55c2d","Type":"ContainerDied","Data":"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168"} Oct 11 10:56:39.589188 master-0 kubenswrapper[4790]: I1011 10:56:39.588685 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"1479006f-cac6-481e-86de-1ec1bed55c2d","Type":"ContainerDied","Data":"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b"} Oct 11 10:56:39.589188 master-0 kubenswrapper[4790]: I1011 10:56:39.588701 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"1479006f-cac6-481e-86de-1ec1bed55c2d","Type":"ContainerDied","Data":"c88496f1c62cf4144371d31cb98a180ae9bb8ab68b19a997dabce86192b784b4"} Oct 11 
10:56:39.589188 master-0 kubenswrapper[4790]: I1011 10:56:39.588748 4790 scope.go:117] "RemoveContainer" containerID="edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168" Oct 11 10:56:39.589188 master-0 kubenswrapper[4790]: I1011 10:56:39.588975 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2" Oct 11 10:56:39.594560 master-0 kubenswrapper[4790]: I1011 10:56:39.594481 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" event={"ID":"6ff24705-c685-47d9-ad1b-9ec04c541bf7","Type":"ContainerDied","Data":"2e505acf6ba0dd723d6c70412053c327d464e773f7d7a12371848c3428f7bbf6"} Oct 11 10:56:39.594642 master-0 kubenswrapper[4790]: I1011 10:56:39.594592 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768f954cfc-9xg22" Oct 11 10:56:39.610500 master-0 kubenswrapper[4790]: I1011 10:56:39.609899 4790 scope.go:117] "RemoveContainer" containerID="963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b" Oct 11 10:56:39.617144 master-0 kubenswrapper[4790]: I1011 10:56:39.617078 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cktcw\" (UniqueName: \"kubernetes.io/projected/1479006f-cac6-481e-86de-1ec1bed55c2d-kube-api-access-cktcw\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.617144 master-0 kubenswrapper[4790]: I1011 10:56:39.617130 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-config-data\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.617144 master-0 kubenswrapper[4790]: I1011 10:56:39.617149 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1479006f-cac6-481e-86de-1ec1bed55c2d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.617347 master-0 
kubenswrapper[4790]: I1011 10:56:39.617162 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1479006f-cac6-481e-86de-1ec1bed55c2d-logs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:39.644092 master-0 kubenswrapper[4790]: I1011 10:56:39.644024 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768f954cfc-9xg22"] Oct 11 10:56:39.645776 master-0 kubenswrapper[4790]: I1011 10:56:39.645696 4790 scope.go:117] "RemoveContainer" containerID="edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168" Oct 11 10:56:39.648212 master-0 kubenswrapper[4790]: E1011 10:56:39.648153 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168\": container with ID starting with edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168 not found: ID does not exist" containerID="edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168" Oct 11 10:56:39.648297 master-0 kubenswrapper[4790]: I1011 10:56:39.648215 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168"} err="failed to get container status \"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168\": rpc error: code = NotFound desc = could not find container \"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168\": container with ID starting with edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168 not found: ID does not exist" Oct 11 10:56:39.648297 master-0 kubenswrapper[4790]: I1011 10:56:39.648248 4790 scope.go:117] "RemoveContainer" containerID="963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b" Oct 11 10:56:39.648772 master-0 kubenswrapper[4790]: E1011 10:56:39.648738 4790 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b\": container with ID starting with 963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b not found: ID does not exist" containerID="963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b" Oct 11 10:56:39.648772 master-0 kubenswrapper[4790]: I1011 10:56:39.648760 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b"} err="failed to get container status \"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b\": rpc error: code = NotFound desc = could not find container \"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b\": container with ID starting with 963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b not found: ID does not exist" Oct 11 10:56:39.648772 master-0 kubenswrapper[4790]: I1011 10:56:39.648774 4790 scope.go:117] "RemoveContainer" containerID="edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168" Oct 11 10:56:39.649108 master-0 kubenswrapper[4790]: I1011 10:56:39.649072 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168"} err="failed to get container status \"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168\": rpc error: code = NotFound desc = could not find container \"edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168\": container with ID starting with edbe8dca445656ff4e9df27c41b52fcad085e14ad25dc7ca18daaf1416a70168 not found: ID does not exist" Oct 11 10:56:39.649108 master-0 kubenswrapper[4790]: I1011 10:56:39.649096 4790 scope.go:117] "RemoveContainer" containerID="963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b" Oct 11 10:56:39.649378 master-0 
kubenswrapper[4790]: I1011 10:56:39.649345 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b"} err="failed to get container status \"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b\": rpc error: code = NotFound desc = could not find container \"963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b\": container with ID starting with 963ab64307dc468bd5caa5c25c49620c1f8348df5ca7e4fa95e7a4a10d90a66b not found: ID does not exist" Oct 11 10:56:39.649378 master-0 kubenswrapper[4790]: I1011 10:56:39.649366 4790 scope.go:117] "RemoveContainer" containerID="585d3075d62338a2a9a42f35ce1fe0086b7e13eaad601eec48eba4bae373241a" Oct 11 10:56:39.649470 master-0 kubenswrapper[4790]: I1011 10:56:39.649418 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768f954cfc-9xg22"] Oct 11 10:56:39.671602 master-0 kubenswrapper[4790]: I1011 10:56:39.671540 4790 scope.go:117] "RemoveContainer" containerID="7210d87c28a292af798e5994b8f7c1185cbe0c9dd8ab3744872cfdcf6e01c602" Oct 11 10:56:39.674238 master-0 kubenswrapper[4790]: I1011 10:56:39.674192 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:56:39.683124 master-0 kubenswrapper[4790]: I1011 10:56:39.683055 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:56:39.713413 master-0 kubenswrapper[4790]: I1011 10:56:39.713087 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:56:39.713849 master-0 kubenswrapper[4790]: E1011 10:56:39.713807 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" containerName="init" Oct 11 10:56:39.713849 master-0 kubenswrapper[4790]: I1011 10:56:39.713838 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" 
containerName="init" Oct 11 10:56:39.713980 master-0 kubenswrapper[4790]: E1011 10:56:39.713870 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" containerName="dnsmasq-dns" Oct 11 10:56:39.713980 master-0 kubenswrapper[4790]: I1011 10:56:39.713879 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" containerName="dnsmasq-dns" Oct 11 10:56:39.713980 master-0 kubenswrapper[4790]: E1011 10:56:39.713892 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-log" Oct 11 10:56:39.713980 master-0 kubenswrapper[4790]: I1011 10:56:39.713900 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-log" Oct 11 10:56:39.713980 master-0 kubenswrapper[4790]: E1011 10:56:39.713929 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-metadata" Oct 11 10:56:39.713980 master-0 kubenswrapper[4790]: I1011 10:56:39.713935 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-metadata" Oct 11 10:56:39.714224 master-0 kubenswrapper[4790]: I1011 10:56:39.714115 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-log" Oct 11 10:56:39.714224 master-0 kubenswrapper[4790]: I1011 10:56:39.714154 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" containerName="nova-metadata-metadata" Oct 11 10:56:39.714224 master-0 kubenswrapper[4790]: I1011 10:56:39.714167 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" containerName="dnsmasq-dns" Oct 11 10:56:39.717499 master-0 kubenswrapper[4790]: 
I1011 10:56:39.717440 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2" Oct 11 10:56:39.721427 master-0 kubenswrapper[4790]: I1011 10:56:39.721366 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 11 10:56:39.721682 master-0 kubenswrapper[4790]: I1011 10:56:39.721641 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 11 10:56:39.729477 master-0 kubenswrapper[4790]: I1011 10:56:39.729389 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:56:39.820474 master-0 kubenswrapper[4790]: I1011 10:56:39.820396 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-logs\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2" Oct 11 10:56:39.820748 master-0 kubenswrapper[4790]: I1011 10:56:39.820502 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ds6m\" (UniqueName: \"kubernetes.io/projected/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-kube-api-access-5ds6m\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2" Oct 11 10:56:39.820748 master-0 kubenswrapper[4790]: I1011 10:56:39.820539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2" Oct 11 10:56:39.820748 master-0 kubenswrapper[4790]: I1011 10:56:39.820570 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-config-data\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.820748 master-0 kubenswrapper[4790]: I1011 10:56:39.820612 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.922663 master-0 kubenswrapper[4790]: I1011 10:56:39.922518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-logs\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.922663 master-0 kubenswrapper[4790]: I1011 10:56:39.922605 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ds6m\" (UniqueName: \"kubernetes.io/projected/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-kube-api-access-5ds6m\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.922663 master-0 kubenswrapper[4790]: I1011 10:56:39.922633 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.922663 master-0 kubenswrapper[4790]: I1011 10:56:39.922660 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-config-data\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.922991 master-0 kubenswrapper[4790]: I1011 10:56:39.922682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.924442 master-0 kubenswrapper[4790]: I1011 10:56:39.924398 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-logs\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.926952 master-0 kubenswrapper[4790]: I1011 10:56:39.926915 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.928186 master-0 kubenswrapper[4790]: I1011 10:56:39.928135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.928388 master-0 kubenswrapper[4790]: I1011 10:56:39.928361 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-config-data\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:39.946661 master-0 kubenswrapper[4790]: I1011 10:56:39.946593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ds6m\" (UniqueName: \"kubernetes.io/projected/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-kube-api-access-5ds6m\") pod \"nova-metadata-2\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:40.040686 master-0 kubenswrapper[4790]: I1011 10:56:40.040607 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:56:40.304775 master-0 kubenswrapper[4790]: I1011 10:56:40.304661 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1479006f-cac6-481e-86de-1ec1bed55c2d" path="/var/lib/kubelet/pods/1479006f-cac6-481e-86de-1ec1bed55c2d/volumes"
Oct 11 10:56:40.305418 master-0 kubenswrapper[4790]: I1011 10:56:40.305399 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff24705-c685-47d9-ad1b-9ec04c541bf7" path="/var/lib/kubelet/pods/6ff24705-c685-47d9-ad1b-9ec04c541bf7/volumes"
Oct 11 10:56:40.493368 master-0 kubenswrapper[4790]: I1011 10:56:40.493302 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:40.617931 master-0 kubenswrapper[4790]: I1011 10:56:40.617834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb","Type":"ContainerStarted","Data":"b30caa9805cc96830370be2ccb0efbda932cb9d93ac0013c9544b523e620e980"}
Oct 11 10:56:41.633511 master-0 kubenswrapper[4790]: I1011 10:56:41.633408 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb","Type":"ContainerStarted","Data":"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"}
Oct 11 10:56:41.633511 master-0 kubenswrapper[4790]: I1011 10:56:41.633488 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb","Type":"ContainerStarted","Data":"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"}
Oct 11 10:56:41.691066 master-0 kubenswrapper[4790]: I1011 10:56:41.690937 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-2" podStartSLOduration=2.690908991 podStartE2EDuration="2.690908991s" podCreationTimestamp="2025-10-11 10:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:56:41.683194949 +0000 UTC m=+1078.237655271" watchObservedRunningTime="2025-10-11 10:56:41.690908991 +0000 UTC m=+1078.245369293"
Oct 11 10:56:42.274490 master-0 kubenswrapper[4790]: I1011 10:56:42.274437 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-748bbfcf89-9smpw"
Oct 11 10:56:43.892770 master-0 kubenswrapper[4790]: I1011 10:56:43.892669 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7887b79bcd-vk5xz"]
Oct 11 10:56:43.893427 master-0 kubenswrapper[4790]: I1011 10:56:43.893055 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7887b79bcd-vk5xz" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-api" containerID="cri-o://477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe" gracePeriod=30
Oct 11 10:56:43.893427 master-0 kubenswrapper[4790]: I1011 10:56:43.893136 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7887b79bcd-vk5xz" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-httpd" containerID="cri-o://2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e" gracePeriod=30
Oct 11 10:56:44.666985 master-0 kubenswrapper[4790]: I1011 10:56:44.666881 4790 generic.go:334] "Generic (PLEG): container finished" podID="7739fd2d-10b5-425d-acbf-f50630f07017" containerID="2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e" exitCode=0
Oct 11 10:56:44.667689 master-0 kubenswrapper[4790]: I1011 10:56:44.666979 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7887b79bcd-vk5xz" event={"ID":"7739fd2d-10b5-425d-acbf-f50630f07017","Type":"ContainerDied","Data":"2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e"}
Oct 11 10:56:45.040790 master-0 kubenswrapper[4790]: I1011 10:56:45.040720 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-2"
Oct 11 10:56:45.040790 master-0 kubenswrapper[4790]: I1011 10:56:45.040792 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-2"
Oct 11 10:56:45.329491 master-0 kubenswrapper[4790]: I1011 10:56:45.329403 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-2"]
Oct 11 10:56:45.330995 master-0 kubenswrapper[4790]: I1011 10:56:45.329700 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-2" podUID="0edb0512-334f-4bfd-b297-cce29a7c510b" containerName="nova-scheduler-scheduler" containerID="cri-o://bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866" gracePeriod=30
Oct 11 10:56:45.429954 master-0 kubenswrapper[4790]: I1011 10:56:45.429879 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:45.672997 master-0 kubenswrapper[4790]: I1011 10:56:45.672857 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-2" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-log" containerID="cri-o://edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec" gracePeriod=30
Oct 11 10:56:45.673326 master-0 kubenswrapper[4790]: I1011 10:56:45.673005 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-2" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-metadata" containerID="cri-o://dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5" gracePeriod=30
Oct 11 10:56:46.327943 master-0 kubenswrapper[4790]: I1011 10:56:46.327876 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:56:46.387405 master-0 kubenswrapper[4790]: I1011 10:56:46.387321 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-combined-ca-bundle\") pod \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") "
Oct 11 10:56:46.387743 master-0 kubenswrapper[4790]: I1011 10:56:46.387433 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-logs\") pod \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") "
Oct 11 10:56:46.387743 master-0 kubenswrapper[4790]: I1011 10:56:46.387537 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-config-data\") pod \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") "
Oct 11 10:56:46.387743 master-0 kubenswrapper[4790]: I1011 10:56:46.387602 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-nova-metadata-tls-certs\") pod \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") "
Oct 11 10:56:46.387957 master-0 kubenswrapper[4790]: I1011 10:56:46.387852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ds6m\" (UniqueName: \"kubernetes.io/projected/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-kube-api-access-5ds6m\") pod \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\" (UID: \"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb\") "
Oct 11 10:56:46.387957 master-0 kubenswrapper[4790]: I1011 10:56:46.387859 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-logs" (OuterVolumeSpecName: "logs") pod "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" (UID: "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 10:56:46.388850 master-0 kubenswrapper[4790]: I1011 10:56:46.388793 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-logs\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:46.393011 master-0 kubenswrapper[4790]: I1011 10:56:46.392944 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-kube-api-access-5ds6m" (OuterVolumeSpecName: "kube-api-access-5ds6m") pod "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" (UID: "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb"). InnerVolumeSpecName "kube-api-access-5ds6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:56:46.405953 master-0 kubenswrapper[4790]: I1011 10:56:46.405869 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" (UID: "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:56:46.423210 master-0 kubenswrapper[4790]: I1011 10:56:46.423131 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-config-data" (OuterVolumeSpecName: "config-data") pod "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" (UID: "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:56:46.439363 master-0 kubenswrapper[4790]: I1011 10:56:46.439298 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" (UID: "aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:56:46.490931 master-0 kubenswrapper[4790]: I1011 10:56:46.490874 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ds6m\" (UniqueName: \"kubernetes.io/projected/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-kube-api-access-5ds6m\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:46.491452 master-0 kubenswrapper[4790]: I1011 10:56:46.491434 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:46.491563 master-0 kubenswrapper[4790]: I1011 10:56:46.491548 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-config-data\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:46.491657 master-0 kubenswrapper[4790]: I1011 10:56:46.491642 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:46.685536 master-0 kubenswrapper[4790]: I1011 10:56:46.685470 4790 generic.go:334] "Generic (PLEG): container finished" podID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerID="dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5" exitCode=0
Oct 11 10:56:46.685536 master-0 kubenswrapper[4790]: I1011 10:56:46.685520 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:56:46.685536 master-0 kubenswrapper[4790]: I1011 10:56:46.685531 4790 generic.go:334] "Generic (PLEG): container finished" podID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerID="edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec" exitCode=143
Oct 11 10:56:46.685536 master-0 kubenswrapper[4790]: I1011 10:56:46.685568 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb","Type":"ContainerDied","Data":"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"}
Oct 11 10:56:46.686187 master-0 kubenswrapper[4790]: I1011 10:56:46.685612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb","Type":"ContainerDied","Data":"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"}
Oct 11 10:56:46.686187 master-0 kubenswrapper[4790]: I1011 10:56:46.685632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb","Type":"ContainerDied","Data":"b30caa9805cc96830370be2ccb0efbda932cb9d93ac0013c9544b523e620e980"}
Oct 11 10:56:46.686187 master-0 kubenswrapper[4790]: I1011 10:56:46.685697 4790 scope.go:117] "RemoveContainer" containerID="dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"
Oct 11 10:56:46.721643 master-0 kubenswrapper[4790]: I1011 10:56:46.721590 4790 scope.go:117] "RemoveContainer" containerID="edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"
Oct 11 10:56:46.745065 master-0 kubenswrapper[4790]: I1011 10:56:46.745031 4790 scope.go:117] "RemoveContainer" containerID="dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"
Oct 11 10:56:46.745933 master-0 kubenswrapper[4790]: E1011 10:56:46.745894 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5\": container with ID starting with dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5 not found: ID does not exist" containerID="dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"
Oct 11 10:56:46.746066 master-0 kubenswrapper[4790]: I1011 10:56:46.745949 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"} err="failed to get container status \"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5\": rpc error: code = NotFound desc = could not find container \"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5\": container with ID starting with dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5 not found: ID does not exist"
Oct 11 10:56:46.746066 master-0 kubenswrapper[4790]: I1011 10:56:46.745980 4790 scope.go:117] "RemoveContainer" containerID="edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"
Oct 11 10:56:46.746463 master-0 kubenswrapper[4790]: E1011 10:56:46.746425 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec\": container with ID starting with edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec not found: ID does not exist" containerID="edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"
Oct 11 10:56:46.746583 master-0 kubenswrapper[4790]: I1011 10:56:46.746554 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"} err="failed to get container status \"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec\": rpc error: code = NotFound desc = could not find container \"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec\": container with ID starting with edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec not found: ID does not exist"
Oct 11 10:56:46.746623 master-0 kubenswrapper[4790]: I1011 10:56:46.746590 4790 scope.go:117] "RemoveContainer" containerID="dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"
Oct 11 10:56:46.747110 master-0 kubenswrapper[4790]: I1011 10:56:46.747079 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5"} err="failed to get container status \"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5\": rpc error: code = NotFound desc = could not find container \"dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5\": container with ID starting with dd353e02d9ecaea5cfa6a72e50b08a9d7d2734f7d93ee1f944d9a386c0f703b5 not found: ID does not exist"
Oct 11 10:56:46.747165 master-0 kubenswrapper[4790]: I1011 10:56:46.747112 4790 scope.go:117] "RemoveContainer" containerID="edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"
Oct 11 10:56:46.747446 master-0 kubenswrapper[4790]: I1011 10:56:46.747425 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec"} err="failed to get container status \"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec\": rpc error: code = NotFound desc = could not find container \"edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec\": container with ID starting with edd9d5cc67a7fa520302dd1bce6ecd1726176be60b3256d0edfdffc2d8e7a1ec not found: ID does not exist"
Oct 11 10:56:46.987830 master-0 kubenswrapper[4790]: I1011 10:56:46.987612 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:47.399763 master-0 kubenswrapper[4790]: I1011 10:56:47.398975 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:47.593200 master-0 kubenswrapper[4790]: E1011 10:56:47.593058 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 11 10:56:47.595758 master-0 kubenswrapper[4790]: E1011 10:56:47.595637 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 11 10:56:47.599747 master-0 kubenswrapper[4790]: E1011 10:56:47.599601 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 11 10:56:47.599866 master-0 kubenswrapper[4790]: E1011 10:56:47.599775 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-2" podUID="0edb0512-334f-4bfd-b297-cce29a7c510b" containerName="nova-scheduler-scheduler"
Oct 11 10:56:47.605825 master-0 kubenswrapper[4790]: I1011 10:56:47.605745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1"
Oct 11 10:56:47.606021 master-0 kubenswrapper[4790]: I1011 10:56:47.605838 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1"
Oct 11 10:56:47.691841 master-0 kubenswrapper[4790]: I1011 10:56:47.691652 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:47.692104 master-0 kubenswrapper[4790]: E1011 10:56:47.692068 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-log"
Oct 11 10:56:47.692104 master-0 kubenswrapper[4790]: I1011 10:56:47.692091 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-log"
Oct 11 10:56:47.692210 master-0 kubenswrapper[4790]: E1011 10:56:47.692137 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-metadata"
Oct 11 10:56:47.692210 master-0 kubenswrapper[4790]: I1011 10:56:47.692145 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-metadata"
Oct 11 10:56:47.692340 master-0 kubenswrapper[4790]: I1011 10:56:47.692303 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-metadata"
Oct 11 10:56:47.692340 master-0 kubenswrapper[4790]: I1011 10:56:47.692334 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" containerName="nova-metadata-log"
Oct 11 10:56:47.693609 master-0 kubenswrapper[4790]: I1011 10:56:47.693574 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:56:47.696948 master-0 kubenswrapper[4790]: I1011 10:56:47.696893 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 11 10:56:47.697171 master-0 kubenswrapper[4790]: I1011 10:56:47.697146 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 11 10:56:47.889788 master-0 kubenswrapper[4790]: I1011 10:56:47.889691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ttc\" (UniqueName: \"kubernetes.io/projected/d221eb73-a42b-4c47-a912-4e47b88297a4-kube-api-access-z7ttc\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.890098 master-0 kubenswrapper[4790]: I1011 10:56:47.889832 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d221eb73-a42b-4c47-a912-4e47b88297a4-logs\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.890098 master-0 kubenswrapper[4790]: I1011 10:56:47.889896 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.890098 master-0 kubenswrapper[4790]: I1011 10:56:47.889947 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-config-data\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.890098 master-0 kubenswrapper[4790]: I1011 10:56:47.889979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.991927 master-0 kubenswrapper[4790]: I1011 10:56:47.991760 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ttc\" (UniqueName: \"kubernetes.io/projected/d221eb73-a42b-4c47-a912-4e47b88297a4-kube-api-access-z7ttc\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.991927 master-0 kubenswrapper[4790]: I1011 10:56:47.991847 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d221eb73-a42b-4c47-a912-4e47b88297a4-logs\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.991927 master-0 kubenswrapper[4790]: I1011 10:56:47.991870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.991927 master-0 kubenswrapper[4790]: I1011 10:56:47.991902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-config-data\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.991927 master-0 kubenswrapper[4790]: I1011 10:56:47.991941 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:47.992629 master-0 kubenswrapper[4790]: I1011 10:56:47.992574 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d221eb73-a42b-4c47-a912-4e47b88297a4-logs\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:48.003097 master-0 kubenswrapper[4790]: I1011 10:56:48.003023 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-config-data\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:48.003241 master-0 kubenswrapper[4790]: I1011 10:56:48.003168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:48.004487 master-0 kubenswrapper[4790]: I1011 10:56:48.004428 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:48.301920 master-0 kubenswrapper[4790]: I1011 10:56:48.301850 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb" path="/var/lib/kubelet/pods/aca84ed9-2b87-42a5-a6f5-33ab7f2d25eb/volumes"
Oct 11 10:56:48.412014 master-0 kubenswrapper[4790]: I1011 10:56:48.411937 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:48.614198 master-0 kubenswrapper[4790]: I1011 10:56:48.614052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7ttc\" (UniqueName: \"kubernetes.io/projected/d221eb73-a42b-4c47-a912-4e47b88297a4-kube-api-access-z7ttc\") pod \"nova-metadata-2\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " pod="openstack/nova-metadata-2"
Oct 11 10:56:48.688268 master-0 kubenswrapper[4790]: I1011 10:56:48.688153 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.130.0.112:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 11 10:56:48.688536 master-0 kubenswrapper[4790]: I1011 10:56:48.688243 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.130.0.112:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 11 10:56:48.910167 master-0 kubenswrapper[4790]: I1011 10:56:48.909966 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:56:49.526540 master-0 kubenswrapper[4790]: I1011 10:56:49.526468 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:56:49.716090 master-0 kubenswrapper[4790]: I1011 10:56:49.716036 4790 generic.go:334] "Generic (PLEG): container finished" podID="0edb0512-334f-4bfd-b297-cce29a7c510b" containerID="bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866" exitCode=0
Oct 11 10:56:49.716292 master-0 kubenswrapper[4790]: I1011 10:56:49.716128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"0edb0512-334f-4bfd-b297-cce29a7c510b","Type":"ContainerDied","Data":"bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866"}
Oct 11 10:56:49.717889 master-0 kubenswrapper[4790]: I1011 10:56:49.717855 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"d221eb73-a42b-4c47-a912-4e47b88297a4","Type":"ContainerStarted","Data":"a01fc17bb36e96804df4939bb484d9c50eb215a10fead9c510b32b80ed9bd4c0"}
Oct 11 10:56:50.036736 master-0 kubenswrapper[4790]: I1011 10:56:50.035690 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2"
Oct 11 10:56:50.142317 master-0 kubenswrapper[4790]: I1011 10:56:50.141223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-config-data\") pod \"0edb0512-334f-4bfd-b297-cce29a7c510b\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") "
Oct 11 10:56:50.142317 master-0 kubenswrapper[4790]: I1011 10:56:50.141337 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-combined-ca-bundle\") pod \"0edb0512-334f-4bfd-b297-cce29a7c510b\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") "
Oct 11 10:56:50.142317 master-0 kubenswrapper[4790]: I1011 10:56:50.141380 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6w4\" (UniqueName: \"kubernetes.io/projected/0edb0512-334f-4bfd-b297-cce29a7c510b-kube-api-access-9l6w4\") pod \"0edb0512-334f-4bfd-b297-cce29a7c510b\" (UID: \"0edb0512-334f-4bfd-b297-cce29a7c510b\") "
Oct 11 10:56:50.147154 master-0 kubenswrapper[4790]: I1011 10:56:50.146602 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edb0512-334f-4bfd-b297-cce29a7c510b-kube-api-access-9l6w4" (OuterVolumeSpecName: "kube-api-access-9l6w4") pod "0edb0512-334f-4bfd-b297-cce29a7c510b" (UID: "0edb0512-334f-4bfd-b297-cce29a7c510b"). InnerVolumeSpecName "kube-api-access-9l6w4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:56:50.168254 master-0 kubenswrapper[4790]: I1011 10:56:50.168164 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0edb0512-334f-4bfd-b297-cce29a7c510b" (UID: "0edb0512-334f-4bfd-b297-cce29a7c510b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:56:50.175717 master-0 kubenswrapper[4790]: I1011 10:56:50.175018 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-config-data" (OuterVolumeSpecName: "config-data") pod "0edb0512-334f-4bfd-b297-cce29a7c510b" (UID: "0edb0512-334f-4bfd-b297-cce29a7c510b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:56:50.243974 master-0 kubenswrapper[4790]: I1011 10:56:50.243675 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:50.243974 master-0 kubenswrapper[4790]: I1011 10:56:50.243740 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6w4\" (UniqueName: \"kubernetes.io/projected/0edb0512-334f-4bfd-b297-cce29a7c510b-kube-api-access-9l6w4\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:50.243974 master-0 kubenswrapper[4790]: I1011 10:56:50.243755 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edb0512-334f-4bfd-b297-cce29a7c510b-config-data\") on node \"master-0\" DevicePath \"\""
Oct 11 10:56:50.420512 master-0 kubenswrapper[4790]: I1011 10:56:50.420463 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7887b79bcd-vk5xz"
Oct 11 10:56:50.447750 master-0 kubenswrapper[4790]: I1011 10:56:50.446865 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-config\") pod \"7739fd2d-10b5-425d-acbf-f50630f07017\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") "
Oct 11 10:56:50.447750 master-0 kubenswrapper[4790]: I1011 10:56:50.446954 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f96jc\" (UniqueName: \"kubernetes.io/projected/7739fd2d-10b5-425d-acbf-f50630f07017-kube-api-access-f96jc\") pod \"7739fd2d-10b5-425d-acbf-f50630f07017\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") "
Oct 11 10:56:50.447750 master-0 kubenswrapper[4790]: I1011 10:56:50.447073 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-combined-ca-bundle\") pod \"7739fd2d-10b5-425d-acbf-f50630f07017\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") "
Oct 11 10:56:50.447750 master-0 kubenswrapper[4790]: I1011 10:56:50.447124 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-ovndb-tls-certs\") pod \"7739fd2d-10b5-425d-acbf-f50630f07017\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") "
Oct 11 10:56:50.447750 master-0 kubenswrapper[4790]: I1011 10:56:50.447171 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-httpd-config\") pod \"7739fd2d-10b5-425d-acbf-f50630f07017\" (UID: \"7739fd2d-10b5-425d-acbf-f50630f07017\") "
Oct 11 10:56:50.457751 master-0 kubenswrapper[4790]: I1011 10:56:50.451848 4790 operation_generator.go:803]
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7739fd2d-10b5-425d-acbf-f50630f07017" (UID: "7739fd2d-10b5-425d-acbf-f50630f07017"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:50.457751 master-0 kubenswrapper[4790]: I1011 10:56:50.454423 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7739fd2d-10b5-425d-acbf-f50630f07017-kube-api-access-f96jc" (OuterVolumeSpecName: "kube-api-access-f96jc") pod "7739fd2d-10b5-425d-acbf-f50630f07017" (UID: "7739fd2d-10b5-425d-acbf-f50630f07017"). InnerVolumeSpecName "kube-api-access-f96jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:56:50.502877 master-0 kubenswrapper[4790]: I1011 10:56:50.498842 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-config" (OuterVolumeSpecName: "config") pod "7739fd2d-10b5-425d-acbf-f50630f07017" (UID: "7739fd2d-10b5-425d-acbf-f50630f07017"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:50.503358 master-0 kubenswrapper[4790]: I1011 10:56:50.503320 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7739fd2d-10b5-425d-acbf-f50630f07017" (UID: "7739fd2d-10b5-425d-acbf-f50630f07017"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:50.533726 master-0 kubenswrapper[4790]: I1011 10:56:50.533622 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7739fd2d-10b5-425d-acbf-f50630f07017" (UID: "7739fd2d-10b5-425d-acbf-f50630f07017"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:56:50.550066 master-0 kubenswrapper[4790]: I1011 10:56:50.550012 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f96jc\" (UniqueName: \"kubernetes.io/projected/7739fd2d-10b5-425d-acbf-f50630f07017-kube-api-access-f96jc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:50.550066 master-0 kubenswrapper[4790]: I1011 10:56:50.550067 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:50.550263 master-0 kubenswrapper[4790]: I1011 10:56:50.550082 4790 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:50.550263 master-0 kubenswrapper[4790]: I1011 10:56:50.550095 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-httpd-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:50.550263 master-0 kubenswrapper[4790]: I1011 10:56:50.550110 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7739fd2d-10b5-425d-acbf-f50630f07017-config\") on node \"master-0\" DevicePath \"\"" Oct 11 10:56:50.727524 master-0 kubenswrapper[4790]: I1011 10:56:50.727449 
4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"d221eb73-a42b-4c47-a912-4e47b88297a4","Type":"ContainerStarted","Data":"b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc"} Oct 11 10:56:50.727524 master-0 kubenswrapper[4790]: I1011 10:56:50.727522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"d221eb73-a42b-4c47-a912-4e47b88297a4","Type":"ContainerStarted","Data":"15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a"} Oct 11 10:56:50.731062 master-0 kubenswrapper[4790]: I1011 10:56:50.731002 4790 generic.go:334] "Generic (PLEG): container finished" podID="7739fd2d-10b5-425d-acbf-f50630f07017" containerID="477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe" exitCode=0 Oct 11 10:56:50.731152 master-0 kubenswrapper[4790]: I1011 10:56:50.731068 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7887b79bcd-vk5xz" event={"ID":"7739fd2d-10b5-425d-acbf-f50630f07017","Type":"ContainerDied","Data":"477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe"} Oct 11 10:56:50.731187 master-0 kubenswrapper[4790]: I1011 10:56:50.731132 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7887b79bcd-vk5xz" Oct 11 10:56:50.731187 master-0 kubenswrapper[4790]: I1011 10:56:50.731164 4790 scope.go:117] "RemoveContainer" containerID="2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e" Oct 11 10:56:50.731259 master-0 kubenswrapper[4790]: I1011 10:56:50.731145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7887b79bcd-vk5xz" event={"ID":"7739fd2d-10b5-425d-acbf-f50630f07017","Type":"ContainerDied","Data":"bff1b3163e12e8cb39a3fbc83d1e97e1c30281603df9047eeb4b5e8ea878efe5"} Oct 11 10:56:50.733496 master-0 kubenswrapper[4790]: I1011 10:56:50.733457 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"0edb0512-334f-4bfd-b297-cce29a7c510b","Type":"ContainerDied","Data":"ce923dabd975e7c88686332f8fea7fed5cd13b9284d3e5e9dcd78a4cee6b26bf"} Oct 11 10:56:50.733552 master-0 kubenswrapper[4790]: I1011 10:56:50.733511 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-2" Oct 11 10:56:50.762032 master-0 kubenswrapper[4790]: I1011 10:56:50.761905 4790 scope.go:117] "RemoveContainer" containerID="477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe" Oct 11 10:56:50.781973 master-0 kubenswrapper[4790]: I1011 10:56:50.781852 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-2" podStartSLOduration=3.781818001 podStartE2EDuration="3.781818001s" podCreationTimestamp="2025-10-11 10:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:56:50.773003529 +0000 UTC m=+1087.327463841" watchObservedRunningTime="2025-10-11 10:56:50.781818001 +0000 UTC m=+1087.336278303" Oct 11 10:56:50.793822 master-0 kubenswrapper[4790]: I1011 10:56:50.793769 4790 scope.go:117] "RemoveContainer" containerID="2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e" Oct 11 10:56:50.794419 master-0 kubenswrapper[4790]: E1011 10:56:50.794339 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e\": container with ID starting with 2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e not found: ID does not exist" containerID="2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e" Oct 11 10:56:50.794507 master-0 kubenswrapper[4790]: I1011 10:56:50.794415 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e"} err="failed to get container status \"2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e\": rpc error: code = NotFound desc = could not find container \"2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e\": container with ID starting with 
2d24a46f82f260f086d4dccdce3d54977e12591932842b6bc27cb71f5af9d42e not found: ID does not exist" Oct 11 10:56:50.794507 master-0 kubenswrapper[4790]: I1011 10:56:50.794449 4790 scope.go:117] "RemoveContainer" containerID="477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe" Oct 11 10:56:50.799158 master-0 kubenswrapper[4790]: E1011 10:56:50.798863 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe\": container with ID starting with 477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe not found: ID does not exist" containerID="477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe" Oct 11 10:56:50.799158 master-0 kubenswrapper[4790]: I1011 10:56:50.798937 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe"} err="failed to get container status \"477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe\": rpc error: code = NotFound desc = could not find container \"477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe\": container with ID starting with 477999ad20086f7e774ba456600ecbd44f0948bdc686eaeda5ddb795290ac9fe not found: ID does not exist" Oct 11 10:56:50.799158 master-0 kubenswrapper[4790]: I1011 10:56:50.798975 4790 scope.go:117] "RemoveContainer" containerID="bcee916a93846dcbbb2d3286227b4256009e1e87a4a77cd08320f9d8ba675866" Oct 11 10:56:50.810200 master-0 kubenswrapper[4790]: I1011 10:56:50.810064 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7887b79bcd-vk5xz"] Oct 11 10:56:50.818211 master-0 kubenswrapper[4790]: I1011 10:56:50.818144 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7887b79bcd-vk5xz"] Oct 11 10:56:50.830395 master-0 kubenswrapper[4790]: I1011 10:56:50.830328 4790 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:56:50.839935 master-0 kubenswrapper[4790]: I1011 10:56:50.839866 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:56:50.876966 master-0 kubenswrapper[4790]: I1011 10:56:50.876867 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:56:50.877548 master-0 kubenswrapper[4790]: E1011 10:56:50.877502 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edb0512-334f-4bfd-b297-cce29a7c510b" containerName="nova-scheduler-scheduler" Oct 11 10:56:50.877548 master-0 kubenswrapper[4790]: I1011 10:56:50.877531 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edb0512-334f-4bfd-b297-cce29a7c510b" containerName="nova-scheduler-scheduler" Oct 11 10:56:50.877748 master-0 kubenswrapper[4790]: E1011 10:56:50.877556 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-api" Oct 11 10:56:50.877748 master-0 kubenswrapper[4790]: I1011 10:56:50.877567 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-api" Oct 11 10:56:50.877748 master-0 kubenswrapper[4790]: E1011 10:56:50.877584 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-httpd" Oct 11 10:56:50.877748 master-0 kubenswrapper[4790]: I1011 10:56:50.877592 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-httpd" Oct 11 10:56:50.880859 master-0 kubenswrapper[4790]: I1011 10:56:50.879264 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-httpd" Oct 11 10:56:50.880859 master-0 kubenswrapper[4790]: I1011 10:56:50.879295 4790 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0edb0512-334f-4bfd-b297-cce29a7c510b" containerName="nova-scheduler-scheduler" Oct 11 10:56:50.880859 master-0 kubenswrapper[4790]: I1011 10:56:50.879305 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" containerName="neutron-api" Oct 11 10:56:50.880859 master-0 kubenswrapper[4790]: I1011 10:56:50.880192 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2" Oct 11 10:56:50.884279 master-0 kubenswrapper[4790]: I1011 10:56:50.883462 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 11 10:56:50.887860 master-0 kubenswrapper[4790]: I1011 10:56:50.887787 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:56:50.961408 master-0 kubenswrapper[4790]: I1011 10:56:50.961324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj2b6\" (UniqueName: \"kubernetes.io/projected/08f5fb34-a451-48f6-91f4-60d27bfd939c-kube-api-access-xj2b6\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:50.961408 master-0 kubenswrapper[4790]: I1011 10:56:50.961389 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-config-data\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:50.961739 master-0 kubenswrapper[4790]: I1011 10:56:50.961475 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: 
\"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:51.063477 master-0 kubenswrapper[4790]: I1011 10:56:51.063381 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:51.063773 master-0 kubenswrapper[4790]: I1011 10:56:51.063519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj2b6\" (UniqueName: \"kubernetes.io/projected/08f5fb34-a451-48f6-91f4-60d27bfd939c-kube-api-access-xj2b6\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:51.063773 master-0 kubenswrapper[4790]: I1011 10:56:51.063554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-config-data\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:51.067850 master-0 kubenswrapper[4790]: I1011 10:56:51.067810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-config-data\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:51.070797 master-0 kubenswrapper[4790]: I1011 10:56:51.070243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:51.090590 master-0 
kubenswrapper[4790]: I1011 10:56:51.090525 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj2b6\" (UniqueName: \"kubernetes.io/projected/08f5fb34-a451-48f6-91f4-60d27bfd939c-kube-api-access-xj2b6\") pod \"nova-scheduler-2\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") " pod="openstack/nova-scheduler-2" Oct 11 10:56:51.215051 master-0 kubenswrapper[4790]: I1011 10:56:51.214952 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2" Oct 11 10:56:51.658981 master-0 kubenswrapper[4790]: I1011 10:56:51.658920 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:56:51.744969 master-0 kubenswrapper[4790]: I1011 10:56:51.744905 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"08f5fb34-a451-48f6-91f4-60d27bfd939c","Type":"ContainerStarted","Data":"7550d6a37aff89c94d6dda17710e1becb7a0d5864e9949954fef2e9819f39291"} Oct 11 10:56:52.304256 master-0 kubenswrapper[4790]: I1011 10:56:52.304132 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edb0512-334f-4bfd-b297-cce29a7c510b" path="/var/lib/kubelet/pods/0edb0512-334f-4bfd-b297-cce29a7c510b/volumes" Oct 11 10:56:52.305075 master-0 kubenswrapper[4790]: I1011 10:56:52.305019 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7739fd2d-10b5-425d-acbf-f50630f07017" path="/var/lib/kubelet/pods/7739fd2d-10b5-425d-acbf-f50630f07017/volumes" Oct 11 10:56:52.776510 master-0 kubenswrapper[4790]: I1011 10:56:52.776425 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"08f5fb34-a451-48f6-91f4-60d27bfd939c","Type":"ContainerStarted","Data":"4e0c106293c88310a5257072d57742a9c018d67c1675782ea7da7a234c8ae82b"} Oct 11 10:56:52.810145 master-0 kubenswrapper[4790]: I1011 10:56:52.810055 4790 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-scheduler-2" podStartSLOduration=2.8100354899999997 podStartE2EDuration="2.81003549s" podCreationTimestamp="2025-10-11 10:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:56:52.80530884 +0000 UTC m=+1089.359769142" watchObservedRunningTime="2025-10-11 10:56:52.81003549 +0000 UTC m=+1089.364495782" Oct 11 10:56:53.910573 master-0 kubenswrapper[4790]: I1011 10:56:53.910487 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-2" Oct 11 10:56:53.911178 master-0 kubenswrapper[4790]: I1011 10:56:53.911042 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-2" Oct 11 10:56:56.215540 master-0 kubenswrapper[4790]: I1011 10:56:56.215455 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-2" Oct 11 10:56:57.604466 master-0 kubenswrapper[4790]: I1011 10:56:57.604354 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Oct 11 10:56:57.604466 master-0 kubenswrapper[4790]: I1011 10:56:57.604436 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Oct 11 10:56:57.611084 master-0 kubenswrapper[4790]: I1011 10:56:57.611001 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Oct 11 10:56:57.611311 master-0 kubenswrapper[4790]: I1011 10:56:57.611162 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Oct 11 10:56:57.829497 master-0 kubenswrapper[4790]: I1011 10:56:57.829436 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-1" Oct 11 10:56:57.834609 master-0 kubenswrapper[4790]: I1011 10:56:57.834528 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-1" Oct 11 10:56:58.910672 master-0 kubenswrapper[4790]: I1011 10:56:58.910573 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-2" Oct 11 10:56:58.910672 master-0 kubenswrapper[4790]: I1011 10:56:58.910662 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-2" Oct 11 10:56:59.939852 master-0 kubenswrapper[4790]: I1011 10:56:59.935941 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-2" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.130.0.115:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:56:59.939852 master-0 kubenswrapper[4790]: I1011 10:56:59.936760 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-2" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.130.0.115:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 10:57:01.216388 master-0 kubenswrapper[4790]: I1011 10:57:01.216168 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-2" Oct 11 10:57:01.247629 master-0 kubenswrapper[4790]: I1011 10:57:01.247560 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-2" Oct 11 10:57:01.889914 master-0 kubenswrapper[4790]: I1011 10:57:01.889837 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-2" Oct 11 10:57:08.919528 master-0 kubenswrapper[4790]: I1011 10:57:08.919443 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-2" Oct 11 10:57:08.925181 master-0 kubenswrapper[4790]: I1011 10:57:08.923392 4790 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-2" Oct 11 10:57:08.931419 master-0 kubenswrapper[4790]: I1011 10:57:08.931229 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-2" Oct 11 10:57:08.969821 master-0 kubenswrapper[4790]: I1011 10:57:08.962806 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-2" Oct 11 10:57:10.139530 master-0 kubenswrapper[4790]: I1011 10:57:10.139457 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1"] Oct 11 10:57:10.140254 master-0 kubenswrapper[4790]: I1011 10:57:10.139798 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-1" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-log" containerID="cri-o://98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4" gracePeriod=30 Oct 11 10:57:10.140254 master-0 kubenswrapper[4790]: I1011 10:57:10.140010 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-1" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-api" containerID="cri-o://b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83" gracePeriod=30 Oct 11 10:57:10.370045 master-0 kubenswrapper[4790]: I1011 10:57:10.369967 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb9b8c955-g7qzg"] Oct 11 10:57:10.375793 master-0 kubenswrapper[4790]: I1011 10:57:10.372109 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" Oct 11 10:57:10.376646 master-0 kubenswrapper[4790]: I1011 10:57:10.376563 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 11 10:57:10.377957 master-0 kubenswrapper[4790]: I1011 10:57:10.377739 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 11 10:57:10.378113 master-0 kubenswrapper[4790]: I1011 10:57:10.377990 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 10:57:10.378248 master-0 kubenswrapper[4790]: I1011 10:57:10.378185 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 10:57:10.378555 master-0 kubenswrapper[4790]: I1011 10:57:10.378490 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 10:57:10.385924 master-0 kubenswrapper[4790]: I1011 10:57:10.385878 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb9b8c955-g7qzg"] Oct 11 10:57:10.467943 master-0 kubenswrapper[4790]: I1011 10:57:10.467601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" Oct 11 10:57:10.467943 master-0 kubenswrapper[4790]: I1011 10:57:10.467674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srbk\" (UniqueName: \"kubernetes.io/projected/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-kube-api-access-4srbk\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" Oct 11 10:57:10.468234 master-0 
kubenswrapper[4790]: I1011 10:57:10.467997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.468234 master-0 kubenswrapper[4790]: I1011 10:57:10.468086 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.468234 master-0 kubenswrapper[4790]: I1011 10:57:10.468121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-svc\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.468947 master-0 kubenswrapper[4790]: I1011 10:57:10.468510 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-config\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.571249 master-0 kubenswrapper[4790]: I1011 10:57:10.571147 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.571548 master-0 kubenswrapper[4790]: I1011 10:57:10.571410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.571548 master-0 kubenswrapper[4790]: I1011 10:57:10.571450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-svc\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.571548 master-0 kubenswrapper[4790]: I1011 10:57:10.571545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-config\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.571770 master-0 kubenswrapper[4790]: I1011 10:57:10.571582 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4srbk\" (UniqueName: \"kubernetes.io/projected/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-kube-api-access-4srbk\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.571770 master-0 kubenswrapper[4790]: I1011 10:57:10.571606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.573088 master-0 kubenswrapper[4790]: I1011 10:57:10.573026 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.573185 master-0 kubenswrapper[4790]: I1011 10:57:10.573145 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.573286 master-0 kubenswrapper[4790]: I1011 10:57:10.573145 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-svc\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.573394 master-0 kubenswrapper[4790]: I1011 10:57:10.573347 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-config\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.574441 master-0 kubenswrapper[4790]: I1011 10:57:10.574366 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.595529 master-0 kubenswrapper[4790]: I1011 10:57:10.595190 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4srbk\" (UniqueName: \"kubernetes.io/projected/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-kube-api-access-4srbk\") pod \"dnsmasq-dns-6cb9b8c955-g7qzg\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.694842 master-0 kubenswrapper[4790]: I1011 10:57:10.694780 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:10.974928 master-0 kubenswrapper[4790]: I1011 10:57:10.974822 4790 generic.go:334] "Generic (PLEG): container finished" podID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerID="98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4" exitCode=143
Oct 11 10:57:10.975555 master-0 kubenswrapper[4790]: I1011 10:57:10.974933 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"253852fc-de03-49f0-8e18-b3ccba3d4966","Type":"ContainerDied","Data":"98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4"}
Oct 11 10:57:11.281455 master-0 kubenswrapper[4790]: I1011 10:57:11.281373 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb9b8c955-g7qzg"]
Oct 11 10:57:11.988789 master-0 kubenswrapper[4790]: I1011 10:57:11.988718 4790 generic.go:334] "Generic (PLEG): container finished" podID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerID="148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079" exitCode=0
Oct 11 10:57:11.988789 master-0 kubenswrapper[4790]: I1011 10:57:11.988775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" event={"ID":"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c","Type":"ContainerDied","Data":"148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079"}
Oct 11 10:57:11.989404 master-0 kubenswrapper[4790]: I1011 10:57:11.988812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" event={"ID":"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c","Type":"ContainerStarted","Data":"82ab564208fa75b6a416368edc3991b9aae0b1bdbf1f7ab61745c571e8067316"}
Oct 11 10:57:13.000340 master-0 kubenswrapper[4790]: I1011 10:57:13.000273 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" event={"ID":"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c","Type":"ContainerStarted","Data":"0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c"}
Oct 11 10:57:13.000931 master-0 kubenswrapper[4790]: I1011 10:57:13.000878 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:13.030945 master-0 kubenswrapper[4790]: I1011 10:57:13.030807 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" podStartSLOduration=3.030778655 podStartE2EDuration="3.030778655s" podCreationTimestamp="2025-10-11 10:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:57:13.026280931 +0000 UTC m=+1109.580741243" watchObservedRunningTime="2025-10-11 10:57:13.030778655 +0000 UTC m=+1109.585238947"
Oct 11 10:57:13.849834 master-0 kubenswrapper[4790]: I1011 10:57:13.849651 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1"
Oct 11 10:57:13.954791 master-0 kubenswrapper[4790]: I1011 10:57:13.954432 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-config-data\") pod \"253852fc-de03-49f0-8e18-b3ccba3d4966\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") "
Oct 11 10:57:13.954791 master-0 kubenswrapper[4790]: I1011 10:57:13.954559 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhcj7\" (UniqueName: \"kubernetes.io/projected/253852fc-de03-49f0-8e18-b3ccba3d4966-kube-api-access-jhcj7\") pod \"253852fc-de03-49f0-8e18-b3ccba3d4966\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") "
Oct 11 10:57:13.954791 master-0 kubenswrapper[4790]: I1011 10:57:13.954603 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-combined-ca-bundle\") pod \"253852fc-de03-49f0-8e18-b3ccba3d4966\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") "
Oct 11 10:57:13.954791 master-0 kubenswrapper[4790]: I1011 10:57:13.954696 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253852fc-de03-49f0-8e18-b3ccba3d4966-logs\") pod \"253852fc-de03-49f0-8e18-b3ccba3d4966\" (UID: \"253852fc-de03-49f0-8e18-b3ccba3d4966\") "
Oct 11 10:57:13.958959 master-0 kubenswrapper[4790]: I1011 10:57:13.958911 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253852fc-de03-49f0-8e18-b3ccba3d4966-kube-api-access-jhcj7" (OuterVolumeSpecName: "kube-api-access-jhcj7") pod "253852fc-de03-49f0-8e18-b3ccba3d4966" (UID: "253852fc-de03-49f0-8e18-b3ccba3d4966"). InnerVolumeSpecName "kube-api-access-jhcj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:57:13.965087 master-0 kubenswrapper[4790]: I1011 10:57:13.965029 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253852fc-de03-49f0-8e18-b3ccba3d4966-logs" (OuterVolumeSpecName: "logs") pod "253852fc-de03-49f0-8e18-b3ccba3d4966" (UID: "253852fc-de03-49f0-8e18-b3ccba3d4966"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 10:57:13.985252 master-0 kubenswrapper[4790]: I1011 10:57:13.985175 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "253852fc-de03-49f0-8e18-b3ccba3d4966" (UID: "253852fc-de03-49f0-8e18-b3ccba3d4966"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:57:13.986460 master-0 kubenswrapper[4790]: I1011 10:57:13.986368 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-config-data" (OuterVolumeSpecName: "config-data") pod "253852fc-de03-49f0-8e18-b3ccba3d4966" (UID: "253852fc-de03-49f0-8e18-b3ccba3d4966"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:57:14.015639 master-0 kubenswrapper[4790]: I1011 10:57:14.015575 4790 generic.go:334] "Generic (PLEG): container finished" podID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerID="b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83" exitCode=0
Oct 11 10:57:14.016589 master-0 kubenswrapper[4790]: I1011 10:57:14.015631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"253852fc-de03-49f0-8e18-b3ccba3d4966","Type":"ContainerDied","Data":"b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83"}
Oct 11 10:57:14.016589 master-0 kubenswrapper[4790]: I1011 10:57:14.015673 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1"
Oct 11 10:57:14.016589 master-0 kubenswrapper[4790]: I1011 10:57:14.015728 4790 scope.go:117] "RemoveContainer" containerID="b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83"
Oct 11 10:57:14.016589 master-0 kubenswrapper[4790]: I1011 10:57:14.015692 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"253852fc-de03-49f0-8e18-b3ccba3d4966","Type":"ContainerDied","Data":"836f61a45327b518c64768eb8146728d82afb4f76df022ecfc8b1a40c35ad99e"}
Oct 11 10:57:14.048459 master-0 kubenswrapper[4790]: I1011 10:57:14.048400 4790 scope.go:117] "RemoveContainer" containerID="98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4"
Oct 11 10:57:14.057207 master-0 kubenswrapper[4790]: I1011 10:57:14.056805 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Oct 11 10:57:14.057207 master-0 kubenswrapper[4790]: I1011 10:57:14.056856 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253852fc-de03-49f0-8e18-b3ccba3d4966-logs\") on node \"master-0\" DevicePath \"\""
Oct 11 10:57:14.057207 master-0 kubenswrapper[4790]: I1011 10:57:14.056871 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253852fc-de03-49f0-8e18-b3ccba3d4966-config-data\") on node \"master-0\" DevicePath \"\""
Oct 11 10:57:14.057207 master-0 kubenswrapper[4790]: I1011 10:57:14.056881 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhcj7\" (UniqueName: \"kubernetes.io/projected/253852fc-de03-49f0-8e18-b3ccba3d4966-kube-api-access-jhcj7\") on node \"master-0\" DevicePath \"\""
Oct 11 10:57:14.082123 master-0 kubenswrapper[4790]: I1011 10:57:14.082073 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:57:14.082518 master-0 kubenswrapper[4790]: I1011 10:57:14.082373 4790 scope.go:117] "RemoveContainer" containerID="b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83"
Oct 11 10:57:14.083646 master-0 kubenswrapper[4790]: E1011 10:57:14.083602 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83\": container with ID starting with b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83 not found: ID does not exist" containerID="b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83"
Oct 11 10:57:14.083764 master-0 kubenswrapper[4790]: I1011 10:57:14.083670 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83"} err="failed to get container status \"b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83\": rpc error: code = NotFound desc = could not find container \"b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83\": container with ID starting with b5982ea4f03ba0bc234b6f62a88bd15cfad78e0ab12b1e1ee49058903171de83 not found: ID does not exist"
Oct 11 10:57:14.083807 master-0 kubenswrapper[4790]: I1011 10:57:14.083782 4790 scope.go:117] "RemoveContainer" containerID="98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4"
Oct 11 10:57:14.084607 master-0 kubenswrapper[4790]: E1011 10:57:14.084545 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4\": container with ID starting with 98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4 not found: ID does not exist" containerID="98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4"
Oct 11 10:57:14.084660 master-0 kubenswrapper[4790]: I1011 10:57:14.084611 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4"} err="failed to get container status \"98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4\": rpc error: code = NotFound desc = could not find container \"98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4\": container with ID starting with 98d215d30f8c7dc0e7fbe10e4417ac3c671b823010a7e36669d3ba73c51d79e4 not found: ID does not exist"
Oct 11 10:57:14.088116 master-0 kubenswrapper[4790]: I1011 10:57:14.088068 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:57:14.120799 master-0 kubenswrapper[4790]: I1011 10:57:14.119667 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:57:14.120799 master-0 kubenswrapper[4790]: E1011 10:57:14.120140 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-log"
Oct 11 10:57:14.120799 master-0 kubenswrapper[4790]: I1011 10:57:14.120160 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-log"
Oct 11 10:57:14.120799 master-0 kubenswrapper[4790]: E1011 10:57:14.120186 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-api"
Oct 11 10:57:14.120799 master-0 kubenswrapper[4790]: I1011 10:57:14.120194 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-api"
Oct 11 10:57:14.120799 master-0 kubenswrapper[4790]: I1011 10:57:14.120384 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-api"
Oct 11 10:57:14.120799 master-0 kubenswrapper[4790]: I1011 10:57:14.120411 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" containerName="nova-api-log"
Oct 11 10:57:14.121502 master-0 kubenswrapper[4790]: I1011 10:57:14.121481 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1"
Oct 11 10:57:14.131562 master-0 kubenswrapper[4790]: I1011 10:57:14.129837 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 11 10:57:14.134889 master-0 kubenswrapper[4790]: I1011 10:57:14.134837 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 11 10:57:14.141068 master-0 kubenswrapper[4790]: I1011 10:57:14.140975 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 11 10:57:14.164931 master-0 kubenswrapper[4790]: I1011 10:57:14.164876 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:57:14.262120 master-0 kubenswrapper[4790]: I1011 10:57:14.262032 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-public-tls-certs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.262260 master-0 kubenswrapper[4790]: I1011 10:57:14.262198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bcb391-53f2-438c-b46e-1f3208011f01-logs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.262260 master-0 kubenswrapper[4790]: I1011 10:57:14.262239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.262345 master-0 kubenswrapper[4790]: I1011 10:57:14.262271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-internal-tls-certs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.262345 master-0 kubenswrapper[4790]: I1011 10:57:14.262303 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8qlt\" (UniqueName: \"kubernetes.io/projected/44bcb391-53f2-438c-b46e-1f3208011f01-kube-api-access-h8qlt\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.262435 master-0 kubenswrapper[4790]: I1011 10:57:14.262380 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-config-data\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.312442 master-0 kubenswrapper[4790]: I1011 10:57:14.312370 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253852fc-de03-49f0-8e18-b3ccba3d4966" path="/var/lib/kubelet/pods/253852fc-de03-49f0-8e18-b3ccba3d4966/volumes"
Oct 11 10:57:14.364456 master-0 kubenswrapper[4790]: I1011 10:57:14.364387 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-public-tls-certs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.364665 master-0 kubenswrapper[4790]: I1011 10:57:14.364515 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bcb391-53f2-438c-b46e-1f3208011f01-logs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.364665 master-0 kubenswrapper[4790]: I1011 10:57:14.364582 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.364665 master-0 kubenswrapper[4790]: I1011 10:57:14.364621 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-internal-tls-certs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.364665 master-0 kubenswrapper[4790]: I1011 10:57:14.364655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8qlt\" (UniqueName: \"kubernetes.io/projected/44bcb391-53f2-438c-b46e-1f3208011f01-kube-api-access-h8qlt\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.364841 master-0 kubenswrapper[4790]: I1011 10:57:14.364685 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-config-data\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.365342 master-0 kubenswrapper[4790]: I1011 10:57:14.365276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bcb391-53f2-438c-b46e-1f3208011f01-logs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.368079 master-0 kubenswrapper[4790]: I1011 10:57:14.368024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-public-tls-certs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.368327 master-0 kubenswrapper[4790]: I1011 10:57:14.368301 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.368435 master-0 kubenswrapper[4790]: I1011 10:57:14.368402 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-config-data\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.370915 master-0 kubenswrapper[4790]: I1011 10:57:14.370812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-internal-tls-certs\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.402354 master-0 kubenswrapper[4790]: I1011 10:57:14.402275 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8qlt\" (UniqueName: \"kubernetes.io/projected/44bcb391-53f2-438c-b46e-1f3208011f01-kube-api-access-h8qlt\") pod \"nova-api-1\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " pod="openstack/nova-api-1"
Oct 11 10:57:14.471506 master-0 kubenswrapper[4790]: I1011 10:57:14.471421 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1"
Oct 11 10:57:14.951604 master-0 kubenswrapper[4790]: I1011 10:57:14.951559 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1"]
Oct 11 10:57:15.027147 master-0 kubenswrapper[4790]: I1011 10:57:15.027058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"44bcb391-53f2-438c-b46e-1f3208011f01","Type":"ContainerStarted","Data":"94a5d59118af400f9a8aa989b9def37c55ec9402d3102f10a1ba404bedd55ff9"}
Oct 11 10:57:16.041635 master-0 kubenswrapper[4790]: I1011 10:57:16.041449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"44bcb391-53f2-438c-b46e-1f3208011f01","Type":"ContainerStarted","Data":"9c19ce238ad447003ee3f2fc9c2488f5e38167272b8ba8dbab542ad52ae186e6"}
Oct 11 10:57:16.041635 master-0 kubenswrapper[4790]: I1011 10:57:16.041531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"44bcb391-53f2-438c-b46e-1f3208011f01","Type":"ContainerStarted","Data":"7c441b122cde11d0c0dd59e93d58166fba491455c97757aa05c250678bb0d192"}
Oct 11 10:57:16.078436 master-0 kubenswrapper[4790]: I1011 10:57:16.078324 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1" podStartSLOduration=2.078300063 podStartE2EDuration="2.078300063s" podCreationTimestamp="2025-10-11 10:57:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:57:16.072995097 +0000 UTC m=+1112.627455389" watchObservedRunningTime="2025-10-11 10:57:16.078300063 +0000 UTC m=+1112.632760355"
Oct 11 10:57:20.698085 master-0 kubenswrapper[4790]: I1011 10:57:20.698012 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg"
Oct 11 10:57:24.472322 master-0 kubenswrapper[4790]: I1011 10:57:24.472207 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1"
Oct 11 10:57:24.472322 master-0 kubenswrapper[4790]: I1011 10:57:24.472304 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1"
Oct 11 10:57:25.493351 master-0 kubenswrapper[4790]: I1011 10:57:25.492916 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.130.0.118:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 11 10:57:25.493351 master-0 kubenswrapper[4790]: I1011 10:57:25.492916 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.130.0.118:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 11 10:57:27.323764 master-0 kubenswrapper[4790]: I1011 10:57:27.323621 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-2"]
Oct 11 10:57:27.324499 master-0 kubenswrapper[4790]: I1011 10:57:27.323964 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-2" podUID="08f5fb34-a451-48f6-91f4-60d27bfd939c" containerName="nova-scheduler-scheduler" containerID="cri-o://4e0c106293c88310a5257072d57742a9c018d67c1675782ea7da7a234c8ae82b" gracePeriod=30
Oct 11 10:57:27.398300 master-0 kubenswrapper[4790]: I1011 10:57:27.398200 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-2"]
Oct 11 10:57:27.398863 master-0 kubenswrapper[4790]: I1011 10:57:27.398786 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-2" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-metadata" containerID="cri-o://b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc" gracePeriod=30
Oct 11 10:57:27.399074 master-0 kubenswrapper[4790]: I1011 10:57:27.399038 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-2" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-log" containerID="cri-o://15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a" gracePeriod=30
Oct 11 10:57:28.186811 master-0 kubenswrapper[4790]: I1011 10:57:28.186677 4790 generic.go:334] "Generic (PLEG): container finished" podID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerID="15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a" exitCode=143
Oct 11 10:57:28.186811 master-0 kubenswrapper[4790]: I1011 10:57:28.186765 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"d221eb73-a42b-4c47-a912-4e47b88297a4","Type":"ContainerDied","Data":"15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a"}
Oct 11 10:57:30.207197 master-0 kubenswrapper[4790]: I1011 10:57:30.206976 4790 generic.go:334] "Generic (PLEG): container finished" podID="08f5fb34-a451-48f6-91f4-60d27bfd939c" containerID="4e0c106293c88310a5257072d57742a9c018d67c1675782ea7da7a234c8ae82b" exitCode=0
Oct 11 10:57:30.207197 master-0 kubenswrapper[4790]: I1011 10:57:30.207056 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"08f5fb34-a451-48f6-91f4-60d27bfd939c","Type":"ContainerDied","Data":"4e0c106293c88310a5257072d57742a9c018d67c1675782ea7da7a234c8ae82b"}
Oct 11 10:57:30.494552 master-0 kubenswrapper[4790]: I1011 10:57:30.494501 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2"
Oct 11 10:57:30.539097 master-0 kubenswrapper[4790]: I1011 10:57:30.535001 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-2" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.130.0.115:8775/\": read tcp 10.130.0.2:36562->10.130.0.115:8775: read: connection reset by peer"
Oct 11 10:57:30.539693 master-0 kubenswrapper[4790]: I1011 10:57:30.539580 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-2" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.130.0.115:8775/\": read tcp 10.130.0.2:36556->10.130.0.115:8775: read: connection reset by peer"
Oct 11 10:57:30.602918 master-0 kubenswrapper[4790]: I1011 10:57:30.602827 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-config-data\") pod \"08f5fb34-a451-48f6-91f4-60d27bfd939c\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") "
Oct 11 10:57:30.603173 master-0 kubenswrapper[4790]: I1011 10:57:30.602936 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj2b6\" (UniqueName: \"kubernetes.io/projected/08f5fb34-a451-48f6-91f4-60d27bfd939c-kube-api-access-xj2b6\") pod \"08f5fb34-a451-48f6-91f4-60d27bfd939c\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") "
Oct 11 10:57:30.603173 master-0 kubenswrapper[4790]: I1011 10:57:30.603088 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-combined-ca-bundle\") pod \"08f5fb34-a451-48f6-91f4-60d27bfd939c\" (UID: \"08f5fb34-a451-48f6-91f4-60d27bfd939c\") "
Oct 11 10:57:30.607801 master-0 kubenswrapper[4790]: I1011 10:57:30.607696 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f5fb34-a451-48f6-91f4-60d27bfd939c-kube-api-access-xj2b6" (OuterVolumeSpecName: "kube-api-access-xj2b6") pod "08f5fb34-a451-48f6-91f4-60d27bfd939c" (UID: "08f5fb34-a451-48f6-91f4-60d27bfd939c"). InnerVolumeSpecName "kube-api-access-xj2b6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 10:57:30.637167 master-0 kubenswrapper[4790]: I1011 10:57:30.636803 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-config-data" (OuterVolumeSpecName: "config-data") pod "08f5fb34-a451-48f6-91f4-60d27bfd939c" (UID: "08f5fb34-a451-48f6-91f4-60d27bfd939c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:57:30.638338 master-0 kubenswrapper[4790]: I1011 10:57:30.638284 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08f5fb34-a451-48f6-91f4-60d27bfd939c" (UID: "08f5fb34-a451-48f6-91f4-60d27bfd939c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 10:57:30.706037 master-0 kubenswrapper[4790]: I1011 10:57:30.705968 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Oct 11 10:57:30.706037 master-0 kubenswrapper[4790]: I1011 10:57:30.706014 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f5fb34-a451-48f6-91f4-60d27bfd939c-config-data\") on node \"master-0\" DevicePath \"\""
Oct 11 10:57:30.706037 master-0 kubenswrapper[4790]: I1011 10:57:30.706026 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj2b6\" (UniqueName: \"kubernetes.io/projected/08f5fb34-a451-48f6-91f4-60d27bfd939c-kube-api-access-xj2b6\") on node \"master-0\" DevicePath \"\""
Oct 11 10:57:31.100758 master-0 kubenswrapper[4790]: I1011 10:57:31.098961 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2"
Oct 11 10:57:31.218251 master-0 kubenswrapper[4790]: I1011 10:57:31.218183 4790 generic.go:334] "Generic (PLEG): container finished" podID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerID="b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc" exitCode=0
Oct 11 10:57:31.218856 master-0 kubenswrapper[4790]: I1011 10:57:31.218275 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"d221eb73-a42b-4c47-a912-4e47b88297a4","Type":"ContainerDied","Data":"b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc"}
Oct 11 10:57:31.218856 master-0 kubenswrapper[4790]: I1011 10:57:31.218310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"d221eb73-a42b-4c47-a912-4e47b88297a4","Type":"ContainerDied","Data":"a01fc17bb36e96804df4939bb484d9c50eb215a10fead9c510b32b80ed9bd4c0"}
Oct 11 10:57:31.218856 master-0 kubenswrapper[4790]: I1011 10:57:31.218332 4790 scope.go:117] "RemoveContainer" containerID="b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc"
Oct 11 10:57:31.218856 master-0 kubenswrapper[4790]: I1011 10:57:31.218488 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-2" Oct 11 10:57:31.220001 master-0 kubenswrapper[4790]: I1011 10:57:31.219975 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d221eb73-a42b-4c47-a912-4e47b88297a4-logs\") pod \"d221eb73-a42b-4c47-a912-4e47b88297a4\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " Oct 11 10:57:31.220065 master-0 kubenswrapper[4790]: I1011 10:57:31.220022 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-combined-ca-bundle\") pod \"d221eb73-a42b-4c47-a912-4e47b88297a4\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " Oct 11 10:57:31.220106 master-0 kubenswrapper[4790]: I1011 10:57:31.220081 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-config-data\") pod \"d221eb73-a42b-4c47-a912-4e47b88297a4\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " Oct 11 10:57:31.220169 master-0 kubenswrapper[4790]: I1011 10:57:31.220151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7ttc\" (UniqueName: \"kubernetes.io/projected/d221eb73-a42b-4c47-a912-4e47b88297a4-kube-api-access-z7ttc\") pod \"d221eb73-a42b-4c47-a912-4e47b88297a4\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " Oct 11 10:57:31.220218 master-0 kubenswrapper[4790]: I1011 10:57:31.220205 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-nova-metadata-tls-certs\") pod \"d221eb73-a42b-4c47-a912-4e47b88297a4\" (UID: \"d221eb73-a42b-4c47-a912-4e47b88297a4\") " Oct 11 10:57:31.223393 master-0 kubenswrapper[4790]: I1011 10:57:31.222930 4790 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d221eb73-a42b-4c47-a912-4e47b88297a4-logs" (OuterVolumeSpecName: "logs") pod "d221eb73-a42b-4c47-a912-4e47b88297a4" (UID: "d221eb73-a42b-4c47-a912-4e47b88297a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:57:31.225686 master-0 kubenswrapper[4790]: I1011 10:57:31.225622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"08f5fb34-a451-48f6-91f4-60d27bfd939c","Type":"ContainerDied","Data":"7550d6a37aff89c94d6dda17710e1becb7a0d5864e9949954fef2e9819f39291"} Oct 11 10:57:31.225796 master-0 kubenswrapper[4790]: I1011 10:57:31.225688 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2" Oct 11 10:57:31.231182 master-0 kubenswrapper[4790]: I1011 10:57:31.231147 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d221eb73-a42b-4c47-a912-4e47b88297a4-kube-api-access-z7ttc" (OuterVolumeSpecName: "kube-api-access-z7ttc") pod "d221eb73-a42b-4c47-a912-4e47b88297a4" (UID: "d221eb73-a42b-4c47-a912-4e47b88297a4"). InnerVolumeSpecName "kube-api-access-z7ttc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:57:31.242741 master-0 kubenswrapper[4790]: I1011 10:57:31.242661 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d221eb73-a42b-4c47-a912-4e47b88297a4" (UID: "d221eb73-a42b-4c47-a912-4e47b88297a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:57:31.247688 master-0 kubenswrapper[4790]: I1011 10:57:31.247656 4790 scope.go:117] "RemoveContainer" containerID="15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a" Oct 11 10:57:31.256306 master-0 kubenswrapper[4790]: I1011 10:57:31.256261 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-config-data" (OuterVolumeSpecName: "config-data") pod "d221eb73-a42b-4c47-a912-4e47b88297a4" (UID: "d221eb73-a42b-4c47-a912-4e47b88297a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:57:31.271698 master-0 kubenswrapper[4790]: I1011 10:57:31.271649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d221eb73-a42b-4c47-a912-4e47b88297a4" (UID: "d221eb73-a42b-4c47-a912-4e47b88297a4"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:57:31.323087 master-0 kubenswrapper[4790]: I1011 10:57:31.322792 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d221eb73-a42b-4c47-a912-4e47b88297a4-logs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:31.323087 master-0 kubenswrapper[4790]: I1011 10:57:31.322865 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:31.323087 master-0 kubenswrapper[4790]: I1011 10:57:31.322886 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-config-data\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:31.323087 master-0 kubenswrapper[4790]: I1011 10:57:31.322905 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7ttc\" (UniqueName: \"kubernetes.io/projected/d221eb73-a42b-4c47-a912-4e47b88297a4-kube-api-access-z7ttc\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:31.323087 master-0 kubenswrapper[4790]: I1011 10:57:31.322926 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d221eb73-a42b-4c47-a912-4e47b88297a4-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:31.331151 master-0 kubenswrapper[4790]: I1011 10:57:31.330320 4790 scope.go:117] "RemoveContainer" containerID="b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc" Oct 11 10:57:31.334167 master-0 kubenswrapper[4790]: E1011 10:57:31.333835 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc\": container with ID starting with 
b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc not found: ID does not exist" containerID="b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc" Oct 11 10:57:31.334167 master-0 kubenswrapper[4790]: I1011 10:57:31.334022 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc"} err="failed to get container status \"b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc\": rpc error: code = NotFound desc = could not find container \"b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc\": container with ID starting with b47f7b57ab8ce5d9b1935a495e120a3eee5b2f462ead1f3d6820c61b709a1bbc not found: ID does not exist" Oct 11 10:57:31.334167 master-0 kubenswrapper[4790]: I1011 10:57:31.334106 4790 scope.go:117] "RemoveContainer" containerID="15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a" Oct 11 10:57:31.335426 master-0 kubenswrapper[4790]: E1011 10:57:31.335361 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a\": container with ID starting with 15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a not found: ID does not exist" containerID="15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a" Oct 11 10:57:31.335497 master-0 kubenswrapper[4790]: I1011 10:57:31.335425 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a"} err="failed to get container status \"15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a\": rpc error: code = NotFound desc = could not find container \"15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a\": container with ID starting with 
15ed041bdb39ede94c3334a41421fc333a2760be59ed774b90e256cc011bfd8a not found: ID does not exist" Oct 11 10:57:31.335497 master-0 kubenswrapper[4790]: I1011 10:57:31.335448 4790 scope.go:117] "RemoveContainer" containerID="4e0c106293c88310a5257072d57742a9c018d67c1675782ea7da7a234c8ae82b" Oct 11 10:57:31.358160 master-0 kubenswrapper[4790]: I1011 10:57:31.358069 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:57:31.373887 master-0 kubenswrapper[4790]: I1011 10:57:31.372333 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:57:31.395462 master-0 kubenswrapper[4790]: I1011 10:57:31.395360 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:57:31.396317 master-0 kubenswrapper[4790]: E1011 10:57:31.395988 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f5fb34-a451-48f6-91f4-60d27bfd939c" containerName="nova-scheduler-scheduler" Oct 11 10:57:31.396317 master-0 kubenswrapper[4790]: I1011 10:57:31.396017 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f5fb34-a451-48f6-91f4-60d27bfd939c" containerName="nova-scheduler-scheduler" Oct 11 10:57:31.396317 master-0 kubenswrapper[4790]: E1011 10:57:31.396118 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-log" Oct 11 10:57:31.396317 master-0 kubenswrapper[4790]: I1011 10:57:31.396134 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-log" Oct 11 10:57:31.396317 master-0 kubenswrapper[4790]: E1011 10:57:31.396168 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-metadata" Oct 11 10:57:31.396317 master-0 kubenswrapper[4790]: I1011 10:57:31.396180 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-metadata" Oct 11 10:57:31.396617 master-0 kubenswrapper[4790]: I1011 10:57:31.396592 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-metadata" Oct 11 10:57:31.396654 master-0 kubenswrapper[4790]: I1011 10:57:31.396627 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" containerName="nova-metadata-log" Oct 11 10:57:31.396654 master-0 kubenswrapper[4790]: I1011 10:57:31.396641 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f5fb34-a451-48f6-91f4-60d27bfd939c" containerName="nova-scheduler-scheduler" Oct 11 10:57:31.398203 master-0 kubenswrapper[4790]: I1011 10:57:31.398161 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2" Oct 11 10:57:31.410295 master-0 kubenswrapper[4790]: I1011 10:57:31.408725 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 11 10:57:31.426125 master-0 kubenswrapper[4790]: I1011 10:57:31.425820 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:57:31.528218 master-0 kubenswrapper[4790]: I1011 10:57:31.527796 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw66q\" (UniqueName: \"kubernetes.io/projected/b9948bac-db47-43c4-8ff5-611d5b07c46a-kube-api-access-qw66q\") pod \"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.528218 master-0 kubenswrapper[4790]: I1011 10:57:31.528025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9948bac-db47-43c4-8ff5-611d5b07c46a-combined-ca-bundle\") pod 
\"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.528218 master-0 kubenswrapper[4790]: I1011 10:57:31.528097 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9948bac-db47-43c4-8ff5-611d5b07c46a-config-data\") pod \"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.567766 master-0 kubenswrapper[4790]: I1011 10:57:31.567593 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:57:31.572588 master-0 kubenswrapper[4790]: I1011 10:57:31.572520 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:57:31.597122 master-0 kubenswrapper[4790]: I1011 10:57:31.597034 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:57:31.598581 master-0 kubenswrapper[4790]: I1011 10:57:31.598540 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-2" Oct 11 10:57:31.602154 master-0 kubenswrapper[4790]: I1011 10:57:31.602110 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 11 10:57:31.602766 master-0 kubenswrapper[4790]: I1011 10:57:31.602742 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 11 10:57:31.620193 master-0 kubenswrapper[4790]: I1011 10:57:31.620133 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:57:31.635176 master-0 kubenswrapper[4790]: I1011 10:57:31.635115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-config-data\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.635176 master-0 kubenswrapper[4790]: I1011 10:57:31.635179 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.635581 master-0 kubenswrapper[4790]: I1011 10:57:31.635322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw66q\" (UniqueName: \"kubernetes.io/projected/b9948bac-db47-43c4-8ff5-611d5b07c46a-kube-api-access-qw66q\") pod \"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.635581 master-0 kubenswrapper[4790]: I1011 10:57:31.635359 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-logs\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.635581 master-0 kubenswrapper[4790]: I1011 10:57:31.635398 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqr9\" (UniqueName: \"kubernetes.io/projected/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-kube-api-access-4jqr9\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.635581 master-0 kubenswrapper[4790]: I1011 10:57:31.635423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9948bac-db47-43c4-8ff5-611d5b07c46a-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.635581 master-0 kubenswrapper[4790]: I1011 10:57:31.635455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.635798 master-0 kubenswrapper[4790]: I1011 10:57:31.635577 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9948bac-db47-43c4-8ff5-611d5b07c46a-config-data\") pod \"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.639377 master-0 kubenswrapper[4790]: I1011 10:57:31.639343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9948bac-db47-43c4-8ff5-611d5b07c46a-config-data\") pod 
\"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.640334 master-0 kubenswrapper[4790]: I1011 10:57:31.640282 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9948bac-db47-43c4-8ff5-611d5b07c46a-combined-ca-bundle\") pod \"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.658636 master-0 kubenswrapper[4790]: I1011 10:57:31.658590 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw66q\" (UniqueName: \"kubernetes.io/projected/b9948bac-db47-43c4-8ff5-611d5b07c46a-kube-api-access-qw66q\") pod \"nova-scheduler-2\" (UID: \"b9948bac-db47-43c4-8ff5-611d5b07c46a\") " pod="openstack/nova-scheduler-2" Oct 11 10:57:31.737065 master-0 kubenswrapper[4790]: I1011 10:57:31.736986 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-config-data\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.737065 master-0 kubenswrapper[4790]: I1011 10:57:31.737057 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.737368 master-0 kubenswrapper[4790]: I1011 10:57:31.737145 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-logs\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.737368 master-0 
kubenswrapper[4790]: I1011 10:57:31.737185 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jqr9\" (UniqueName: \"kubernetes.io/projected/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-kube-api-access-4jqr9\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.737368 master-0 kubenswrapper[4790]: I1011 10:57:31.737216 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.738022 master-0 kubenswrapper[4790]: I1011 10:57:31.737985 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-logs\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.741245 master-0 kubenswrapper[4790]: I1011 10:57:31.741199 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-config-data\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.741690 master-0 kubenswrapper[4790]: I1011 10:57:31.741650 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-nova-metadata-tls-certs\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.741770 master-0 kubenswrapper[4790]: I1011 10:57:31.741684 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-combined-ca-bundle\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.746000 master-0 kubenswrapper[4790]: I1011 10:57:31.745946 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-2" Oct 11 10:57:31.768032 master-0 kubenswrapper[4790]: I1011 10:57:31.767966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqr9\" (UniqueName: \"kubernetes.io/projected/720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a-kube-api-access-4jqr9\") pod \"nova-metadata-2\" (UID: \"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a\") " pod="openstack/nova-metadata-2" Oct 11 10:57:31.920526 master-0 kubenswrapper[4790]: I1011 10:57:31.919993 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-2" Oct 11 10:57:32.240112 master-0 kubenswrapper[4790]: I1011 10:57:32.240036 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-2"] Oct 11 10:57:32.250492 master-0 kubenswrapper[4790]: W1011 10:57:32.250437 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9948bac_db47_43c4_8ff5_611d5b07c46a.slice/crio-1312fae28a8f53e7d49345f0efdd83d317a71323723d978a9985c0f156a93040 WatchSource:0}: Error finding container 1312fae28a8f53e7d49345f0efdd83d317a71323723d978a9985c0f156a93040: Status 404 returned error can't find the container with id 1312fae28a8f53e7d49345f0efdd83d317a71323723d978a9985c0f156a93040 Oct 11 10:57:32.302378 master-0 kubenswrapper[4790]: I1011 10:57:32.302322 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f5fb34-a451-48f6-91f4-60d27bfd939c" path="/var/lib/kubelet/pods/08f5fb34-a451-48f6-91f4-60d27bfd939c/volumes" Oct 11 10:57:32.303044 master-0 
kubenswrapper[4790]: I1011 10:57:32.302976 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d221eb73-a42b-4c47-a912-4e47b88297a4" path="/var/lib/kubelet/pods/d221eb73-a42b-4c47-a912-4e47b88297a4/volumes" Oct 11 10:57:32.405637 master-0 kubenswrapper[4790]: I1011 10:57:32.405565 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-2"] Oct 11 10:57:33.283639 master-0 kubenswrapper[4790]: I1011 10:57:33.283507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a","Type":"ContainerStarted","Data":"ae61c5515b9a2d374036db159ab569f90c948aa23970bdf132ca34ea4b15dba5"} Oct 11 10:57:33.284602 master-0 kubenswrapper[4790]: I1011 10:57:33.283656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a","Type":"ContainerStarted","Data":"eb81c163f9c6609135566926180071a0096d86a45537992958ef439100113427"} Oct 11 10:57:33.284602 master-0 kubenswrapper[4790]: I1011 10:57:33.283678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-2" event={"ID":"720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a","Type":"ContainerStarted","Data":"09d665290ab6ab8b3167b9064402e3deb7f949d44b8db9a7b0fef350f6ada2c2"} Oct 11 10:57:33.288309 master-0 kubenswrapper[4790]: I1011 10:57:33.288210 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"b9948bac-db47-43c4-8ff5-611d5b07c46a","Type":"ContainerStarted","Data":"83885f8b5bfcbb13a3cc61e2c11507b2a0f6ecbae2c6a4917e213d1c701fa61c"} Oct 11 10:57:33.288409 master-0 kubenswrapper[4790]: I1011 10:57:33.288318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-2" event={"ID":"b9948bac-db47-43c4-8ff5-611d5b07c46a","Type":"ContainerStarted","Data":"1312fae28a8f53e7d49345f0efdd83d317a71323723d978a9985c0f156a93040"} Oct 11 10:57:33.322652 master-0 
kubenswrapper[4790]: I1011 10:57:33.322433 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-2" podStartSLOduration=2.322413584 podStartE2EDuration="2.322413584s" podCreationTimestamp="2025-10-11 10:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:57:33.316465419 +0000 UTC m=+1129.870925721" watchObservedRunningTime="2025-10-11 10:57:33.322413584 +0000 UTC m=+1129.876873876" Oct 11 10:57:33.368606 master-0 kubenswrapper[4790]: I1011 10:57:33.368421 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-2" podStartSLOduration=2.368376579 podStartE2EDuration="2.368376579s" podCreationTimestamp="2025-10-11 10:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:57:33.352866381 +0000 UTC m=+1129.907326763" watchObservedRunningTime="2025-10-11 10:57:33.368376579 +0000 UTC m=+1129.922836871" Oct 11 10:57:34.483017 master-0 kubenswrapper[4790]: I1011 10:57:34.482931 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Oct 11 10:57:34.483868 master-0 kubenswrapper[4790]: I1011 10:57:34.483832 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Oct 11 10:57:34.489412 master-0 kubenswrapper[4790]: I1011 10:57:34.489349 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Oct 11 10:57:34.493946 master-0 kubenswrapper[4790]: I1011 10:57:34.493885 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-1" Oct 11 10:57:35.312304 master-0 kubenswrapper[4790]: I1011 10:57:35.312244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Oct 11 10:57:35.323800 
master-0 kubenswrapper[4790]: I1011 10:57:35.323653 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-1" Oct 11 10:57:36.746547 master-0 kubenswrapper[4790]: I1011 10:57:36.746421 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-2" Oct 11 10:57:36.920566 master-0 kubenswrapper[4790]: I1011 10:57:36.920478 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-2" Oct 11 10:57:36.920877 master-0 kubenswrapper[4790]: I1011 10:57:36.920579 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-2" Oct 11 10:57:41.747315 master-0 kubenswrapper[4790]: I1011 10:57:41.747242 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-2" Oct 11 10:57:41.785606 master-0 kubenswrapper[4790]: I1011 10:57:41.784538 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-2" Oct 11 10:57:41.921357 master-0 kubenswrapper[4790]: I1011 10:57:41.921287 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-2" Oct 11 10:57:41.921357 master-0 kubenswrapper[4790]: I1011 10:57:41.921368 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-2" Oct 11 10:57:42.410732 master-0 kubenswrapper[4790]: I1011 10:57:42.410642 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-2" Oct 11 10:57:42.936057 master-0 kubenswrapper[4790]: I1011 10:57:42.935951 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-2" podUID="720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.130.0.120:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 
10:57:42.936748 master-0 kubenswrapper[4790]: I1011 10:57:42.935988 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-2" podUID="720d2b85-b5e9-46cd-8d71-f5ef6a31cd5a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.130.0.120:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:57:49.818159 master-0 kubenswrapper[4790]: I1011 10:57:49.818045 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1"] Oct 11 10:57:49.819039 master-0 kubenswrapper[4790]: I1011 10:57:49.818457 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-1" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-log" containerID="cri-o://7c441b122cde11d0c0dd59e93d58166fba491455c97757aa05c250678bb0d192" gracePeriod=30 Oct 11 10:57:49.819222 master-0 kubenswrapper[4790]: I1011 10:57:49.819170 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-1" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-api" containerID="cri-o://9c19ce238ad447003ee3f2fc9c2488f5e38167272b8ba8dbab542ad52ae186e6" gracePeriod=30 Oct 11 10:57:50.472484 master-0 kubenswrapper[4790]: I1011 10:57:50.472394 4790 generic.go:334] "Generic (PLEG): container finished" podID="44bcb391-53f2-438c-b46e-1f3208011f01" containerID="7c441b122cde11d0c0dd59e93d58166fba491455c97757aa05c250678bb0d192" exitCode=143 Oct 11 10:57:50.472922 master-0 kubenswrapper[4790]: I1011 10:57:50.472486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"44bcb391-53f2-438c-b46e-1f3208011f01","Type":"ContainerDied","Data":"7c441b122cde11d0c0dd59e93d58166fba491455c97757aa05c250678bb0d192"} Oct 11 10:57:51.927788 master-0 kubenswrapper[4790]: I1011 10:57:51.927678 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-2" Oct 11 10:57:51.928415 master-0 kubenswrapper[4790]: I1011 10:57:51.927849 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-2" Oct 11 10:57:51.934613 master-0 kubenswrapper[4790]: I1011 10:57:51.934548 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-2" Oct 11 10:57:51.936820 master-0 kubenswrapper[4790]: I1011 10:57:51.936765 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-2" Oct 11 10:57:53.533736 master-0 kubenswrapper[4790]: I1011 10:57:53.529889 4790 generic.go:334] "Generic (PLEG): container finished" podID="44bcb391-53f2-438c-b46e-1f3208011f01" containerID="9c19ce238ad447003ee3f2fc9c2488f5e38167272b8ba8dbab542ad52ae186e6" exitCode=0 Oct 11 10:57:53.533736 master-0 kubenswrapper[4790]: I1011 10:57:53.530335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"44bcb391-53f2-438c-b46e-1f3208011f01","Type":"ContainerDied","Data":"9c19ce238ad447003ee3f2fc9c2488f5e38167272b8ba8dbab542ad52ae186e6"} Oct 11 10:57:53.654191 master-0 kubenswrapper[4790]: I1011 10:57:53.654131 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1" Oct 11 10:57:53.811679 master-0 kubenswrapper[4790]: I1011 10:57:53.811607 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-internal-tls-certs\") pod \"44bcb391-53f2-438c-b46e-1f3208011f01\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " Oct 11 10:57:53.812024 master-0 kubenswrapper[4790]: I1011 10:57:53.811913 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bcb391-53f2-438c-b46e-1f3208011f01-logs\") pod \"44bcb391-53f2-438c-b46e-1f3208011f01\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " Oct 11 10:57:53.812024 master-0 kubenswrapper[4790]: I1011 10:57:53.811939 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-combined-ca-bundle\") pod \"44bcb391-53f2-438c-b46e-1f3208011f01\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " Oct 11 10:57:53.812024 master-0 kubenswrapper[4790]: I1011 10:57:53.811987 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-config-data\") pod \"44bcb391-53f2-438c-b46e-1f3208011f01\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " Oct 11 10:57:53.812130 master-0 kubenswrapper[4790]: I1011 10:57:53.812065 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8qlt\" (UniqueName: \"kubernetes.io/projected/44bcb391-53f2-438c-b46e-1f3208011f01-kube-api-access-h8qlt\") pod \"44bcb391-53f2-438c-b46e-1f3208011f01\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " Oct 11 10:57:53.812130 master-0 kubenswrapper[4790]: I1011 10:57:53.812097 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-public-tls-certs\") pod \"44bcb391-53f2-438c-b46e-1f3208011f01\" (UID: \"44bcb391-53f2-438c-b46e-1f3208011f01\") " Oct 11 10:57:53.812720 master-0 kubenswrapper[4790]: I1011 10:57:53.812636 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bcb391-53f2-438c-b46e-1f3208011f01-logs" (OuterVolumeSpecName: "logs") pod "44bcb391-53f2-438c-b46e-1f3208011f01" (UID: "44bcb391-53f2-438c-b46e-1f3208011f01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 10:57:53.815904 master-0 kubenswrapper[4790]: I1011 10:57:53.815826 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bcb391-53f2-438c-b46e-1f3208011f01-kube-api-access-h8qlt" (OuterVolumeSpecName: "kube-api-access-h8qlt") pod "44bcb391-53f2-438c-b46e-1f3208011f01" (UID: "44bcb391-53f2-438c-b46e-1f3208011f01"). InnerVolumeSpecName "kube-api-access-h8qlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 10:57:53.834793 master-0 kubenswrapper[4790]: I1011 10:57:53.834673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-config-data" (OuterVolumeSpecName: "config-data") pod "44bcb391-53f2-438c-b46e-1f3208011f01" (UID: "44bcb391-53f2-438c-b46e-1f3208011f01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:57:53.837092 master-0 kubenswrapper[4790]: I1011 10:57:53.837025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44bcb391-53f2-438c-b46e-1f3208011f01" (UID: "44bcb391-53f2-438c-b46e-1f3208011f01"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:57:53.851199 master-0 kubenswrapper[4790]: I1011 10:57:53.851113 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44bcb391-53f2-438c-b46e-1f3208011f01" (UID: "44bcb391-53f2-438c-b46e-1f3208011f01"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:57:53.855330 master-0 kubenswrapper[4790]: I1011 10:57:53.855247 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "44bcb391-53f2-438c-b46e-1f3208011f01" (UID: "44bcb391-53f2-438c-b46e-1f3208011f01"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 10:57:53.914593 master-0 kubenswrapper[4790]: I1011 10:57:53.914455 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44bcb391-53f2-438c-b46e-1f3208011f01-logs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:53.914593 master-0 kubenswrapper[4790]: I1011 10:57:53.914513 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:53.914593 master-0 kubenswrapper[4790]: I1011 10:57:53.914528 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-config-data\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:53.914593 master-0 kubenswrapper[4790]: I1011 10:57:53.914538 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8qlt\" 
(UniqueName: \"kubernetes.io/projected/44bcb391-53f2-438c-b46e-1f3208011f01-kube-api-access-h8qlt\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:53.914593 master-0 kubenswrapper[4790]: I1011 10:57:53.914548 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:53.914593 master-0 kubenswrapper[4790]: I1011 10:57:53.914559 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bcb391-53f2-438c-b46e-1f3208011f01-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Oct 11 10:57:54.549464 master-0 kubenswrapper[4790]: I1011 10:57:54.549373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"44bcb391-53f2-438c-b46e-1f3208011f01","Type":"ContainerDied","Data":"94a5d59118af400f9a8aa989b9def37c55ec9402d3102f10a1ba404bedd55ff9"} Oct 11 10:57:54.549464 master-0 kubenswrapper[4790]: I1011 10:57:54.549444 4790 scope.go:117] "RemoveContainer" containerID="9c19ce238ad447003ee3f2fc9c2488f5e38167272b8ba8dbab542ad52ae186e6" Oct 11 10:57:54.550526 master-0 kubenswrapper[4790]: I1011 10:57:54.549638 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1" Oct 11 10:57:54.573923 master-0 kubenswrapper[4790]: I1011 10:57:54.573860 4790 scope.go:117] "RemoveContainer" containerID="7c441b122cde11d0c0dd59e93d58166fba491455c97757aa05c250678bb0d192" Oct 11 10:57:54.587244 master-0 kubenswrapper[4790]: I1011 10:57:54.587056 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1"] Oct 11 10:57:54.609731 master-0 kubenswrapper[4790]: I1011 10:57:54.609631 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1"] Oct 11 10:57:54.624186 master-0 kubenswrapper[4790]: I1011 10:57:54.624096 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1"] Oct 11 10:57:54.624630 master-0 kubenswrapper[4790]: E1011 10:57:54.624591 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-api" Oct 11 10:57:54.624630 master-0 kubenswrapper[4790]: I1011 10:57:54.624616 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-api" Oct 11 10:57:54.624805 master-0 kubenswrapper[4790]: E1011 10:57:54.624642 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-log" Oct 11 10:57:54.624805 master-0 kubenswrapper[4790]: I1011 10:57:54.624654 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-log" Oct 11 10:57:54.624934 master-0 kubenswrapper[4790]: I1011 10:57:54.624875 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-api" Oct 11 10:57:54.624934 master-0 kubenswrapper[4790]: I1011 10:57:54.624908 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" containerName="nova-api-log" Oct 11 10:57:54.626055 master-0 
kubenswrapper[4790]: I1011 10:57:54.626023 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1" Oct 11 10:57:54.629786 master-0 kubenswrapper[4790]: I1011 10:57:54.629735 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 11 10:57:54.630022 master-0 kubenswrapper[4790]: I1011 10:57:54.629972 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 11 10:57:54.630739 master-0 kubenswrapper[4790]: I1011 10:57:54.630656 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 11 10:57:54.642433 master-0 kubenswrapper[4790]: I1011 10:57:54.642361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1"] Oct 11 10:57:54.731952 master-0 kubenswrapper[4790]: I1011 10:57:54.731880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qzw6\" (UniqueName: \"kubernetes.io/projected/acf716b4-c93a-4303-ab38-507bbc33a8c6-kube-api-access-4qzw6\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.732195 master-0 kubenswrapper[4790]: I1011 10:57:54.731991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-internal-tls-certs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.732195 master-0 kubenswrapper[4790]: I1011 10:57:54.732052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-config-data\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" 
Oct 11 10:57:54.732195 master-0 kubenswrapper[4790]: I1011 10:57:54.732078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-public-tls-certs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.732195 master-0 kubenswrapper[4790]: I1011 10:57:54.732174 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf716b4-c93a-4303-ab38-507bbc33a8c6-logs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.732365 master-0 kubenswrapper[4790]: I1011 10:57:54.732208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.833946 master-0 kubenswrapper[4790]: I1011 10:57:54.833885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf716b4-c93a-4303-ab38-507bbc33a8c6-logs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.833946 master-0 kubenswrapper[4790]: I1011 10:57:54.833961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.834374 master-0 kubenswrapper[4790]: I1011 10:57:54.834036 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4qzw6\" (UniqueName: \"kubernetes.io/projected/acf716b4-c93a-4303-ab38-507bbc33a8c6-kube-api-access-4qzw6\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.834374 master-0 kubenswrapper[4790]: I1011 10:57:54.834064 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-internal-tls-certs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.834374 master-0 kubenswrapper[4790]: I1011 10:57:54.834102 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-config-data\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.834374 master-0 kubenswrapper[4790]: I1011 10:57:54.834123 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-public-tls-certs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.834583 master-0 kubenswrapper[4790]: I1011 10:57:54.834524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf716b4-c93a-4303-ab38-507bbc33a8c6-logs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.838305 master-0 kubenswrapper[4790]: I1011 10:57:54.838229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-public-tls-certs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " 
pod="openstack/nova-api-1" Oct 11 10:57:54.839783 master-0 kubenswrapper[4790]: I1011 10:57:54.839359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-config-data\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.840313 master-0 kubenswrapper[4790]: I1011 10:57:54.840265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-internal-tls-certs\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.842331 master-0 kubenswrapper[4790]: I1011 10:57:54.842267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf716b4-c93a-4303-ab38-507bbc33a8c6-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.857583 master-0 kubenswrapper[4790]: I1011 10:57:54.857517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qzw6\" (UniqueName: \"kubernetes.io/projected/acf716b4-c93a-4303-ab38-507bbc33a8c6-kube-api-access-4qzw6\") pod \"nova-api-1\" (UID: \"acf716b4-c93a-4303-ab38-507bbc33a8c6\") " pod="openstack/nova-api-1" Oct 11 10:57:54.950902 master-0 kubenswrapper[4790]: I1011 10:57:54.950814 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1" Oct 11 10:57:55.381203 master-0 kubenswrapper[4790]: I1011 10:57:55.381138 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1"] Oct 11 10:57:55.386077 master-0 kubenswrapper[4790]: W1011 10:57:55.386013 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacf716b4_c93a_4303_ab38_507bbc33a8c6.slice/crio-f8c70f81f08b25cf8f98d4be418be2dd429ec74969110b9c3807243c1c9c37b2 WatchSource:0}: Error finding container f8c70f81f08b25cf8f98d4be418be2dd429ec74969110b9c3807243c1c9c37b2: Status 404 returned error can't find the container with id f8c70f81f08b25cf8f98d4be418be2dd429ec74969110b9c3807243c1c9c37b2 Oct 11 10:57:55.569843 master-0 kubenswrapper[4790]: I1011 10:57:55.568164 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"acf716b4-c93a-4303-ab38-507bbc33a8c6","Type":"ContainerStarted","Data":"0403695d910ac7dcbe6ecb8d65fbc7fad8faaaabae301a75a3da18e2abe2b981"} Oct 11 10:57:55.569843 master-0 kubenswrapper[4790]: I1011 10:57:55.568268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"acf716b4-c93a-4303-ab38-507bbc33a8c6","Type":"ContainerStarted","Data":"f8c70f81f08b25cf8f98d4be418be2dd429ec74969110b9c3807243c1c9c37b2"} Oct 11 10:57:56.304372 master-0 kubenswrapper[4790]: I1011 10:57:56.304307 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bcb391-53f2-438c-b46e-1f3208011f01" path="/var/lib/kubelet/pods/44bcb391-53f2-438c-b46e-1f3208011f01/volumes" Oct 11 10:57:56.585179 master-0 kubenswrapper[4790]: I1011 10:57:56.584974 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"acf716b4-c93a-4303-ab38-507bbc33a8c6","Type":"ContainerStarted","Data":"4d2db9ffc182292640a480f615fb53f7bfbe826c4265583c5552f11c703f2805"} Oct 11 10:57:56.619689 master-0 
kubenswrapper[4790]: I1011 10:57:56.619566 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1" podStartSLOduration=2.619541555 podStartE2EDuration="2.619541555s" podCreationTimestamp="2025-10-11 10:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 10:57:56.61390477 +0000 UTC m=+1153.168365102" watchObservedRunningTime="2025-10-11 10:57:56.619541555 +0000 UTC m=+1153.174001847" Oct 11 10:58:04.951328 master-0 kubenswrapper[4790]: I1011 10:58:04.951244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1" Oct 11 10:58:04.951328 master-0 kubenswrapper[4790]: I1011 10:58:04.951312 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1" Oct 11 10:58:05.967098 master-0 kubenswrapper[4790]: I1011 10:58:05.966941 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="acf716b4-c93a-4303-ab38-507bbc33a8c6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.130.0.121:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 10:58:05.967098 master-0 kubenswrapper[4790]: I1011 10:58:05.967098 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="acf716b4-c93a-4303-ab38-507bbc33a8c6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.130.0.121:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 10:58:14.958910 master-0 kubenswrapper[4790]: I1011 10:58:14.958842 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Oct 11 10:58:14.959677 master-0 kubenswrapper[4790]: I1011 10:58:14.959410 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Oct 11 
10:58:14.961228 master-0 kubenswrapper[4790]: I1011 10:58:14.961155 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Oct 11 10:58:14.966964 master-0 kubenswrapper[4790]: I1011 10:58:14.966913 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-1" Oct 11 10:58:15.792586 master-0 kubenswrapper[4790]: I1011 10:58:15.792515 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Oct 11 10:58:15.799665 master-0 kubenswrapper[4790]: I1011 10:58:15.799595 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-1" Oct 11 11:00:44.776342 master-0 kubenswrapper[4790]: I1011 11:00:44.776188 4790 scope.go:117] "RemoveContainer" containerID="cc0e410018cdbb38cb0a44455ce0c9bcffaa24fb5b85e7a4f71ece632724bed8" Oct 11 11:01:44.844414 master-0 kubenswrapper[4790]: I1011 11:01:44.844371 4790 scope.go:117] "RemoveContainer" containerID="d68ca291cf2b0da68d6a8d6c151be0aa939d7b802691ee00796f934bd8619918" Oct 11 11:03:03.623624 master-0 kubenswrapper[4790]: I1011 11:03:03.623497 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-j7sxz"] Oct 11 11:03:03.625949 master-0 kubenswrapper[4790]: I1011 11:03:03.625900 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.632658 master-0 kubenswrapper[4790]: I1011 11:03:03.632598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 11 11:03:03.634039 master-0 kubenswrapper[4790]: I1011 11:03:03.633984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 11 11:03:03.634132 master-0 kubenswrapper[4790]: I1011 11:03:03.634073 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 11 11:03:03.640232 master-0 kubenswrapper[4790]: I1011 11:03:03.640145 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-j7sxz"] Oct 11 11:03:03.744309 master-0 kubenswrapper[4790]: I1011 11:03:03.744236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3240af3d-94ab-4045-9618-f4a58c53b5a0-hm-ports\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.744514 master-0 kubenswrapper[4790]: I1011 11:03:03.744328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3240af3d-94ab-4045-9618-f4a58c53b5a0-config-data\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.744589 master-0 kubenswrapper[4790]: I1011 11:03:03.744534 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3240af3d-94ab-4045-9618-f4a58c53b5a0-config-data-merged\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 
11:03:03.744853 master-0 kubenswrapper[4790]: I1011 11:03:03.744812 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3240af3d-94ab-4045-9618-f4a58c53b5a0-scripts\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.847071 master-0 kubenswrapper[4790]: I1011 11:03:03.846979 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3240af3d-94ab-4045-9618-f4a58c53b5a0-hm-ports\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.847071 master-0 kubenswrapper[4790]: I1011 11:03:03.847073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3240af3d-94ab-4045-9618-f4a58c53b5a0-config-data\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.847398 master-0 kubenswrapper[4790]: I1011 11:03:03.847110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3240af3d-94ab-4045-9618-f4a58c53b5a0-config-data-merged\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.847398 master-0 kubenswrapper[4790]: I1011 11:03:03.847154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3240af3d-94ab-4045-9618-f4a58c53b5a0-scripts\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.848170 master-0 kubenswrapper[4790]: I1011 11:03:03.848105 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3240af3d-94ab-4045-9618-f4a58c53b5a0-config-data-merged\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.848933 master-0 kubenswrapper[4790]: I1011 11:03:03.848867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3240af3d-94ab-4045-9618-f4a58c53b5a0-hm-ports\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.852280 master-0 kubenswrapper[4790]: I1011 11:03:03.852236 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3240af3d-94ab-4045-9618-f4a58c53b5a0-config-data\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.874538 master-0 kubenswrapper[4790]: I1011 11:03:03.874399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3240af3d-94ab-4045-9618-f4a58c53b5a0-scripts\") pod \"octavia-rsyslog-j7sxz\" (UID: \"3240af3d-94ab-4045-9618-f4a58c53b5a0\") " pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:03.949533 master-0 kubenswrapper[4790]: I1011 11:03:03.949457 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:05.343921 master-0 kubenswrapper[4790]: I1011 11:03:05.343833 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-j7sxz"] Oct 11 11:03:05.355611 master-0 kubenswrapper[4790]: I1011 11:03:05.355557 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 11:03:05.800198 master-0 kubenswrapper[4790]: I1011 11:03:05.800140 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-j7sxz" event={"ID":"3240af3d-94ab-4045-9618-f4a58c53b5a0","Type":"ContainerStarted","Data":"8a82447c206637bb30e47630382a5d9161cf334f1ba255ad4e7a627d844d40f1"} Oct 11 11:03:13.868214 master-0 kubenswrapper[4790]: I1011 11:03:13.868125 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-j7sxz" event={"ID":"3240af3d-94ab-4045-9618-f4a58c53b5a0","Type":"ContainerStarted","Data":"33658efd0dee5c5cf0929135102f1452d4a0592380545f82fb598f9624bf85a0"} Oct 11 11:03:14.881384 master-0 kubenswrapper[4790]: I1011 11:03:14.879410 4790 generic.go:334] "Generic (PLEG): container finished" podID="3240af3d-94ab-4045-9618-f4a58c53b5a0" containerID="33658efd0dee5c5cf0929135102f1452d4a0592380545f82fb598f9624bf85a0" exitCode=0 Oct 11 11:03:14.881384 master-0 kubenswrapper[4790]: I1011 11:03:14.879490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-j7sxz" event={"ID":"3240af3d-94ab-4045-9618-f4a58c53b5a0","Type":"ContainerDied","Data":"33658efd0dee5c5cf0929135102f1452d4a0592380545f82fb598f9624bf85a0"} Oct 11 11:03:16.899859 master-0 kubenswrapper[4790]: I1011 11:03:16.899777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-j7sxz" event={"ID":"3240af3d-94ab-4045-9618-f4a58c53b5a0","Type":"ContainerStarted","Data":"61bff005e5fd07a8cb4da8a1244c56bec23767c401f084a3a78027ce157e8436"} Oct 11 11:03:16.900753 master-0 
kubenswrapper[4790]: I1011 11:03:16.900055 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:16.944737 master-0 kubenswrapper[4790]: I1011 11:03:16.939415 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-j7sxz" podStartSLOduration=3.560733548 podStartE2EDuration="13.939358277s" podCreationTimestamp="2025-10-11 11:03:03 +0000 UTC" firstStartedPulling="2025-10-11 11:03:05.355470076 +0000 UTC m=+1461.909930368" lastFinishedPulling="2025-10-11 11:03:15.734094805 +0000 UTC m=+1472.288555097" observedRunningTime="2025-10-11 11:03:16.931240495 +0000 UTC m=+1473.485700797" watchObservedRunningTime="2025-10-11 11:03:16.939358277 +0000 UTC m=+1473.493818579" Oct 11 11:03:33.995434 master-0 kubenswrapper[4790]: I1011 11:03:33.995285 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-j7sxz" Oct 11 11:03:52.055075 master-0 kubenswrapper[4790]: I1011 11:03:52.054941 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-113b-account-create-twjxb"] Oct 11 11:03:52.068492 master-0 kubenswrapper[4790]: I1011 11:03:52.068369 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-113b-account-create-twjxb"] Oct 11 11:03:52.304290 master-0 kubenswrapper[4790]: I1011 11:03:52.304161 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70cbbe93-7c50-40cb-91f4-f75c8875580d" path="/var/lib/kubelet/pods/70cbbe93-7c50-40cb-91f4-f75c8875580d/volumes" Oct 11 11:03:59.047943 master-0 kubenswrapper[4790]: I1011 11:03:59.047864 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-af51-account-create-tz8f4"] Oct 11 11:03:59.055522 master-0 kubenswrapper[4790]: I1011 11:03:59.055463 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0a7a-account-create-9c44k"] Oct 11 11:03:59.065297 master-0 
kubenswrapper[4790]: I1011 11:03:59.065214 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-af51-account-create-tz8f4"] Oct 11 11:03:59.104467 master-0 kubenswrapper[4790]: I1011 11:03:59.104376 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0a7a-account-create-9c44k"] Oct 11 11:04:00.303666 master-0 kubenswrapper[4790]: I1011 11:04:00.303614 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4838cae2-31c3-4b4d-a914-e95b0b6308be" path="/var/lib/kubelet/pods/4838cae2-31c3-4b4d-a914-e95b0b6308be/volumes" Oct 11 11:04:00.304228 master-0 kubenswrapper[4790]: I1011 11:04:00.304174 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21954bc-9fb3-4d4e-8085-b2fcf628e0a5" path="/var/lib/kubelet/pods/c21954bc-9fb3-4d4e-8085-b2fcf628e0a5/volumes" Oct 11 11:04:08.758918 master-0 kubenswrapper[4790]: I1011 11:04:08.758862 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-wmmzh"] Oct 11 11:04:08.761879 master-0 kubenswrapper[4790]: I1011 11:04:08.761851 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:08.768136 master-0 kubenswrapper[4790]: I1011 11:04:08.768059 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 11 11:04:08.768526 master-0 kubenswrapper[4790]: I1011 11:04:08.768488 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 11 11:04:08.768766 master-0 kubenswrapper[4790]: I1011 11:04:08.768695 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 11 11:04:08.824085 master-0 kubenswrapper[4790]: I1011 11:04:08.823943 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-wmmzh"] Oct 11 11:04:08.903000 master-0 kubenswrapper[4790]: I1011 11:04:08.902914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-scripts\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:08.903335 master-0 kubenswrapper[4790]: I1011 11:04:08.903030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-amphora-certs\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:08.903335 master-0 kubenswrapper[4790]: I1011 11:04:08.903113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-combined-ca-bundle\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " 
pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:08.903335 master-0 kubenswrapper[4790]: I1011 11:04:08.903288 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-config-data\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:08.903542 master-0 kubenswrapper[4790]: I1011 11:04:08.903516 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-hm-ports\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:08.903642 master-0 kubenswrapper[4790]: I1011 11:04:08.903614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-config-data-merged\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.005405 master-0 kubenswrapper[4790]: I1011 11:04:09.005237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-combined-ca-bundle\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.005405 master-0 kubenswrapper[4790]: I1011 11:04:09.005309 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-config-data\") pod \"octavia-healthmanager-wmmzh\" (UID: 
\"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.005405 master-0 kubenswrapper[4790]: I1011 11:04:09.005348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-hm-ports\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.005405 master-0 kubenswrapper[4790]: I1011 11:04:09.005371 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-config-data-merged\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.005845 master-0 kubenswrapper[4790]: I1011 11:04:09.005441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-scripts\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.005845 master-0 kubenswrapper[4790]: I1011 11:04:09.005474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-amphora-certs\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.006483 master-0 kubenswrapper[4790]: I1011 11:04:09.006429 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-config-data-merged\") pod \"octavia-healthmanager-wmmzh\" (UID: 
\"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.007844 master-0 kubenswrapper[4790]: I1011 11:04:09.007786 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-hm-ports\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.010009 master-0 kubenswrapper[4790]: I1011 11:04:09.009975 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-combined-ca-bundle\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.010334 master-0 kubenswrapper[4790]: I1011 11:04:09.010297 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-amphora-certs\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.013074 master-0 kubenswrapper[4790]: I1011 11:04:09.013027 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-config-data\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.015985 master-0 kubenswrapper[4790]: I1011 11:04:09.015921 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b23d0bf-c533-41d4-aa06-4cbb6bcda90d-scripts\") pod \"octavia-healthmanager-wmmzh\" (UID: \"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d\") " 
pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:09.107182 master-0 kubenswrapper[4790]: I1011 11:04:09.107050 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:10.586440 master-0 kubenswrapper[4790]: I1011 11:04:10.586377 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-wmmzh"] Oct 11 11:04:10.593048 master-0 kubenswrapper[4790]: W1011 11:04:10.592962 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b23d0bf_c533_41d4_aa06_4cbb6bcda90d.slice/crio-2932d56e27a8a6f750a2db321cab98a75e2c3e86af7cc35652f038c46413cb3f WatchSource:0}: Error finding container 2932d56e27a8a6f750a2db321cab98a75e2c3e86af7cc35652f038c46413cb3f: Status 404 returned error can't find the container with id 2932d56e27a8a6f750a2db321cab98a75e2c3e86af7cc35652f038c46413cb3f Oct 11 11:04:11.041740 master-0 kubenswrapper[4790]: I1011 11:04:11.040054 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-8wwvc"] Oct 11 11:04:11.042674 master-0 kubenswrapper[4790]: I1011 11:04:11.042592 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.047198 master-0 kubenswrapper[4790]: I1011 11:04:11.047147 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 11 11:04:11.047505 master-0 kubenswrapper[4790]: I1011 11:04:11.047454 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 11 11:04:11.066598 master-0 kubenswrapper[4790]: I1011 11:04:11.066517 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-8wwvc"] Oct 11 11:04:11.161945 master-0 kubenswrapper[4790]: I1011 11:04:11.161884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cec8d92b-373a-45a6-926a-6e2ae2a2645d-config-data-merged\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.162249 master-0 kubenswrapper[4790]: I1011 11:04:11.161970 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-combined-ca-bundle\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.162249 master-0 kubenswrapper[4790]: I1011 11:04:11.162044 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cec8d92b-373a-45a6-926a-6e2ae2a2645d-hm-ports\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.162249 master-0 kubenswrapper[4790]: I1011 11:04:11.162077 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-scripts\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.162249 master-0 kubenswrapper[4790]: I1011 11:04:11.162201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-amphora-certs\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.162249 master-0 kubenswrapper[4790]: I1011 11:04:11.162221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-config-data\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.265204 master-0 kubenswrapper[4790]: I1011 11:04:11.265037 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-amphora-certs\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.265204 master-0 kubenswrapper[4790]: I1011 11:04:11.265090 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-config-data\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.265204 master-0 kubenswrapper[4790]: I1011 11:04:11.265124 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cec8d92b-373a-45a6-926a-6e2ae2a2645d-config-data-merged\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.265204 master-0 kubenswrapper[4790]: I1011 11:04:11.265154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-combined-ca-bundle\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.265204 master-0 kubenswrapper[4790]: I1011 11:04:11.265203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cec8d92b-373a-45a6-926a-6e2ae2a2645d-hm-ports\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.265591 master-0 kubenswrapper[4790]: I1011 11:04:11.265245 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-scripts\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.265973 master-0 kubenswrapper[4790]: I1011 11:04:11.265935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cec8d92b-373a-45a6-926a-6e2ae2a2645d-config-data-merged\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.270539 master-0 kubenswrapper[4790]: I1011 11:04:11.270494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cec8d92b-373a-45a6-926a-6e2ae2a2645d-hm-ports\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.278728 master-0 kubenswrapper[4790]: I1011 11:04:11.274800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-scripts\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.278728 master-0 kubenswrapper[4790]: I1011 11:04:11.276056 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-amphora-certs\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.278728 master-0 kubenswrapper[4790]: I1011 11:04:11.276402 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-combined-ca-bundle\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.278728 master-0 kubenswrapper[4790]: I1011 11:04:11.270512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec8d92b-373a-45a6-926a-6e2ae2a2645d-config-data\") pod \"octavia-housekeeping-8wwvc\" (UID: \"cec8d92b-373a-45a6-926a-6e2ae2a2645d\") " pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.389701 master-0 kubenswrapper[4790]: I1011 11:04:11.387967 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:11.565806 master-0 kubenswrapper[4790]: I1011 11:04:11.565634 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wmmzh" event={"ID":"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d","Type":"ContainerStarted","Data":"197ff42e3cee7c14a2bd909bd444bf54fe56101f70a5136c28611d688198512d"} Oct 11 11:04:11.565806 master-0 kubenswrapper[4790]: I1011 11:04:11.565726 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wmmzh" event={"ID":"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d","Type":"ContainerStarted","Data":"2932d56e27a8a6f750a2db321cab98a75e2c3e86af7cc35652f038c46413cb3f"} Oct 11 11:04:12.840266 master-0 kubenswrapper[4790]: I1011 11:04:12.839993 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-bp8q6"] Oct 11 11:04:12.844508 master-0 kubenswrapper[4790]: I1011 11:04:12.844425 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:12.850035 master-0 kubenswrapper[4790]: I1011 11:04:12.849975 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 11 11:04:12.851209 master-0 kubenswrapper[4790]: I1011 11:04:12.851187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 11 11:04:12.885327 master-0 kubenswrapper[4790]: I1011 11:04:12.885274 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-bp8q6"] Oct 11 11:04:12.957975 master-0 kubenswrapper[4790]: I1011 11:04:12.957907 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-8wwvc"] Oct 11 11:04:13.014933 master-0 kubenswrapper[4790]: I1011 11:04:13.014862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-amphora-certs\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.015148 master-0 kubenswrapper[4790]: I1011 11:04:13.014993 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-scripts\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.015148 master-0 kubenswrapper[4790]: I1011 11:04:13.015063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-combined-ca-bundle\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.015148 master-0 
kubenswrapper[4790]: I1011 11:04:13.015099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/943bdd13-9252-4c87-b669-9cf3f566e2ec-config-data-merged\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.015148 master-0 kubenswrapper[4790]: I1011 11:04:13.015124 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/943bdd13-9252-4c87-b669-9cf3f566e2ec-hm-ports\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.015148 master-0 kubenswrapper[4790]: I1011 11:04:13.015149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-config-data\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.117481 master-0 kubenswrapper[4790]: I1011 11:04:13.117313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-scripts\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.117481 master-0 kubenswrapper[4790]: I1011 11:04:13.117445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-combined-ca-bundle\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.117750 master-0 kubenswrapper[4790]: I1011 11:04:13.117502 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/943bdd13-9252-4c87-b669-9cf3f566e2ec-config-data-merged\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.117750 master-0 kubenswrapper[4790]: I1011 11:04:13.117550 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/943bdd13-9252-4c87-b669-9cf3f566e2ec-hm-ports\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.117750 master-0 kubenswrapper[4790]: I1011 11:04:13.117588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-config-data\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.117750 master-0 kubenswrapper[4790]: I1011 11:04:13.117627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-amphora-certs\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.119256 master-0 kubenswrapper[4790]: I1011 11:04:13.119222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/943bdd13-9252-4c87-b669-9cf3f566e2ec-config-data-merged\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.121447 master-0 kubenswrapper[4790]: I1011 11:04:13.121406 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/943bdd13-9252-4c87-b669-9cf3f566e2ec-hm-ports\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.121936 master-0 kubenswrapper[4790]: I1011 11:04:13.121883 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-combined-ca-bundle\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.122256 master-0 kubenswrapper[4790]: I1011 11:04:13.122240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-config-data\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.123411 master-0 kubenswrapper[4790]: I1011 11:04:13.123340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-amphora-certs\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.124014 master-0 kubenswrapper[4790]: I1011 11:04:13.123978 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943bdd13-9252-4c87-b669-9cf3f566e2ec-scripts\") pod \"octavia-worker-bp8q6\" (UID: \"943bdd13-9252-4c87-b669-9cf3f566e2ec\") " pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.305480 master-0 kubenswrapper[4790]: I1011 11:04:13.305394 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:13.592991 master-0 kubenswrapper[4790]: I1011 11:04:13.592906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-8wwvc" event={"ID":"cec8d92b-373a-45a6-926a-6e2ae2a2645d","Type":"ContainerStarted","Data":"f8b26c91595fdeca6125ac3fec47c1996dcc05da06ddd08630f24201c2b98afe"} Oct 11 11:04:13.600062 master-0 kubenswrapper[4790]: I1011 11:04:13.599548 4790 generic.go:334] "Generic (PLEG): container finished" podID="2b23d0bf-c533-41d4-aa06-4cbb6bcda90d" containerID="197ff42e3cee7c14a2bd909bd444bf54fe56101f70a5136c28611d688198512d" exitCode=0 Oct 11 11:04:13.600062 master-0 kubenswrapper[4790]: I1011 11:04:13.599626 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wmmzh" event={"ID":"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d","Type":"ContainerDied","Data":"197ff42e3cee7c14a2bd909bd444bf54fe56101f70a5136c28611d688198512d"} Oct 11 11:04:13.872681 master-0 kubenswrapper[4790]: I1011 11:04:13.872585 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-bp8q6"] Oct 11 11:04:13.874316 master-0 kubenswrapper[4790]: W1011 11:04:13.874213 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943bdd13_9252_4c87_b669_9cf3f566e2ec.slice/crio-f515c27a0294a207e428aa07aa8c085866d42831b8cf9ed0b011d7785d14e28d WatchSource:0}: Error finding container f515c27a0294a207e428aa07aa8c085866d42831b8cf9ed0b011d7785d14e28d: Status 404 returned error can't find the container with id f515c27a0294a207e428aa07aa8c085866d42831b8cf9ed0b011d7785d14e28d Oct 11 11:04:14.072183 master-0 kubenswrapper[4790]: I1011 11:04:14.072075 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jmmst"] Oct 11 11:04:14.079376 master-0 kubenswrapper[4790]: I1011 11:04:14.079300 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-create-l7rcp"] Oct 11 11:04:14.087908 master-0 kubenswrapper[4790]: I1011 11:04:14.087760 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jmmst"] Oct 11 11:04:14.091793 master-0 kubenswrapper[4790]: I1011 11:04:14.091724 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-l7rcp"] Oct 11 11:04:14.312292 master-0 kubenswrapper[4790]: I1011 11:04:14.312120 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40b588a-5009-41c8-b8b0-b417de6693ac" path="/var/lib/kubelet/pods/d40b588a-5009-41c8-b8b0-b417de6693ac/volumes" Oct 11 11:04:14.313755 master-0 kubenswrapper[4790]: I1011 11:04:14.313696 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe51cce-787f-4883-8b8f-f1ed50caa3d3" path="/var/lib/kubelet/pods/dfe51cce-787f-4883-8b8f-f1ed50caa3d3/volumes" Oct 11 11:04:14.622499 master-0 kubenswrapper[4790]: I1011 11:04:14.622425 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-bp8q6" event={"ID":"943bdd13-9252-4c87-b669-9cf3f566e2ec","Type":"ContainerStarted","Data":"f515c27a0294a207e428aa07aa8c085866d42831b8cf9ed0b011d7785d14e28d"} Oct 11 11:04:14.629631 master-0 kubenswrapper[4790]: I1011 11:04:14.629562 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wmmzh" event={"ID":"2b23d0bf-c533-41d4-aa06-4cbb6bcda90d","Type":"ContainerStarted","Data":"c907e97fe44860901f875d6d455a963e45f0d2b6a7ae64e83df0f6564e422b34"} Oct 11 11:04:14.630930 master-0 kubenswrapper[4790]: I1011 11:04:14.630887 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:14.658421 master-0 kubenswrapper[4790]: I1011 11:04:14.658343 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-wmmzh" podStartSLOduration=6.658322955 podStartE2EDuration="6.658322955s" 
podCreationTimestamp="2025-10-11 11:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 11:04:14.654265474 +0000 UTC m=+1531.208725776" watchObservedRunningTime="2025-10-11 11:04:14.658322955 +0000 UTC m=+1531.212783247" Oct 11 11:04:15.043863 master-0 kubenswrapper[4790]: I1011 11:04:15.043817 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-7vsxp"] Oct 11 11:04:15.056687 master-0 kubenswrapper[4790]: I1011 11:04:15.056589 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-7vsxp"] Oct 11 11:04:15.642779 master-0 kubenswrapper[4790]: I1011 11:04:15.642652 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-8wwvc" event={"ID":"cec8d92b-373a-45a6-926a-6e2ae2a2645d","Type":"ContainerStarted","Data":"41a01c86c951fd5fa24d58edf7651edb8ffe47ee4cc4c4a17c703d5b3d2e1523"} Oct 11 11:04:16.306681 master-0 kubenswrapper[4790]: I1011 11:04:16.306603 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58f3b14-e8da-4046-afb1-c376a65ef16e" path="/var/lib/kubelet/pods/d58f3b14-e8da-4046-afb1-c376a65ef16e/volumes" Oct 11 11:04:16.653650 master-0 kubenswrapper[4790]: I1011 11:04:16.653443 4790 generic.go:334] "Generic (PLEG): container finished" podID="cec8d92b-373a-45a6-926a-6e2ae2a2645d" containerID="41a01c86c951fd5fa24d58edf7651edb8ffe47ee4cc4c4a17c703d5b3d2e1523" exitCode=0 Oct 11 11:04:16.653650 master-0 kubenswrapper[4790]: I1011 11:04:16.653528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-8wwvc" event={"ID":"cec8d92b-373a-45a6-926a-6e2ae2a2645d","Type":"ContainerDied","Data":"41a01c86c951fd5fa24d58edf7651edb8ffe47ee4cc4c4a17c703d5b3d2e1523"} Oct 11 11:04:16.656444 master-0 kubenswrapper[4790]: I1011 11:04:16.656367 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-worker-bp8q6" event={"ID":"943bdd13-9252-4c87-b669-9cf3f566e2ec","Type":"ContainerStarted","Data":"ea93697b9c384496775b0ae62c141aa148f53ec6c485357004d0d0c95b051d40"} Oct 11 11:04:17.669410 master-0 kubenswrapper[4790]: I1011 11:04:17.669329 4790 generic.go:334] "Generic (PLEG): container finished" podID="943bdd13-9252-4c87-b669-9cf3f566e2ec" containerID="ea93697b9c384496775b0ae62c141aa148f53ec6c485357004d0d0c95b051d40" exitCode=0 Oct 11 11:04:17.670055 master-0 kubenswrapper[4790]: I1011 11:04:17.669429 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-bp8q6" event={"ID":"943bdd13-9252-4c87-b669-9cf3f566e2ec","Type":"ContainerDied","Data":"ea93697b9c384496775b0ae62c141aa148f53ec6c485357004d0d0c95b051d40"} Oct 11 11:04:17.681224 master-0 kubenswrapper[4790]: I1011 11:04:17.681155 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-8wwvc" event={"ID":"cec8d92b-373a-45a6-926a-6e2ae2a2645d","Type":"ContainerStarted","Data":"3ff4efd92c1cfb0a3fce962e4c178a773e76ade548c2a57f01380d3b0f25f161"} Oct 11 11:04:17.682162 master-0 kubenswrapper[4790]: I1011 11:04:17.682128 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:17.796419 master-0 kubenswrapper[4790]: I1011 11:04:17.796281 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-8wwvc" podStartSLOduration=5.22956999 podStartE2EDuration="6.796259991s" podCreationTimestamp="2025-10-11 11:04:11 +0000 UTC" firstStartedPulling="2025-10-11 11:04:12.974605995 +0000 UTC m=+1529.529066287" lastFinishedPulling="2025-10-11 11:04:14.541295996 +0000 UTC m=+1531.095756288" observedRunningTime="2025-10-11 11:04:17.794509013 +0000 UTC m=+1534.348969325" watchObservedRunningTime="2025-10-11 11:04:17.796259991 +0000 UTC m=+1534.350720283" Oct 11 11:04:18.702591 master-0 kubenswrapper[4790]: I1011 11:04:18.702434 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-bp8q6" event={"ID":"943bdd13-9252-4c87-b669-9cf3f566e2ec","Type":"ContainerStarted","Data":"6a1fa28481cc708f6993ce48dafaee0a728e36376439b0a4cc765b441e1c044e"} Oct 11 11:04:18.735081 master-0 kubenswrapper[4790]: I1011 11:04:18.734952 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-bp8q6" podStartSLOduration=4.585599372 podStartE2EDuration="6.734928707s" podCreationTimestamp="2025-10-11 11:04:12 +0000 UTC" firstStartedPulling="2025-10-11 11:04:13.878980384 +0000 UTC m=+1530.433440676" lastFinishedPulling="2025-10-11 11:04:16.028309719 +0000 UTC m=+1532.582770011" observedRunningTime="2025-10-11 11:04:18.731270397 +0000 UTC m=+1535.285730709" watchObservedRunningTime="2025-10-11 11:04:18.734928707 +0000 UTC m=+1535.289388999" Oct 11 11:04:19.713594 master-0 kubenswrapper[4790]: I1011 11:04:19.713498 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:24.142192 master-0 kubenswrapper[4790]: I1011 11:04:24.142088 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-wmmzh" Oct 11 11:04:26.054309 master-0 kubenswrapper[4790]: I1011 11:04:26.054232 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-gvzlv"] Oct 11 11:04:26.066948 master-0 kubenswrapper[4790]: I1011 11:04:26.066754 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-gvzlv"] Oct 11 11:04:26.307269 master-0 kubenswrapper[4790]: I1011 11:04:26.307129 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2137512f-c759-4935-944d-48248c99c2ec" path="/var/lib/kubelet/pods/2137512f-c759-4935-944d-48248c99c2ec/volumes" Oct 11 11:04:26.437891 master-0 kubenswrapper[4790]: I1011 11:04:26.437828 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/octavia-housekeeping-8wwvc" Oct 11 11:04:28.335014 master-0 kubenswrapper[4790]: I1011 11:04:28.334943 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-bp8q6" Oct 11 11:04:33.062529 master-0 kubenswrapper[4790]: I1011 11:04:33.062204 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b634-account-create-vb2w7"] Oct 11 11:04:33.070926 master-0 kubenswrapper[4790]: I1011 11:04:33.070852 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-9ac8-account-create-r5rxs"] Oct 11 11:04:33.077644 master-0 kubenswrapper[4790]: I1011 11:04:33.077551 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2033-account-create-jh9gc"] Oct 11 11:04:33.094253 master-0 kubenswrapper[4790]: I1011 11:04:33.094115 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b634-account-create-vb2w7"] Oct 11 11:04:33.094253 master-0 kubenswrapper[4790]: I1011 11:04:33.094218 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2033-account-create-jh9gc"] Oct 11 11:04:33.101080 master-0 kubenswrapper[4790]: I1011 11:04:33.101020 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-9ac8-account-create-r5rxs"] Oct 11 11:04:34.313336 master-0 kubenswrapper[4790]: I1011 11:04:34.313198 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a325c6-b9b6-495b-87dc-d6e12b3f1029" path="/var/lib/kubelet/pods/08a325c6-b9b6-495b-87dc-d6e12b3f1029/volumes" Oct 11 11:04:34.314538 master-0 kubenswrapper[4790]: I1011 11:04:34.314472 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ddf95f-6e9c-4f3c-b742-87379c6594b2" path="/var/lib/kubelet/pods/09ddf95f-6e9c-4f3c-b742-87379c6594b2/volumes" Oct 11 11:04:34.315648 master-0 kubenswrapper[4790]: I1011 11:04:34.315581 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cdd4a60e-f24a-48fe-afcb-c7ccab615f69" path="/var/lib/kubelet/pods/cdd4a60e-f24a-48fe-afcb-c7ccab615f69/volumes" Oct 11 11:04:36.058133 master-0 kubenswrapper[4790]: I1011 11:04:36.058039 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-e1bf-account-create-9qds8"] Oct 11 11:04:36.065490 master-0 kubenswrapper[4790]: I1011 11:04:36.065417 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-e1bf-account-create-9qds8"] Oct 11 11:04:36.305950 master-0 kubenswrapper[4790]: I1011 11:04:36.305887 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b3f6bf-ef4b-41fa-b098-fc5620a92300" path="/var/lib/kubelet/pods/03b3f6bf-ef4b-41fa-b098-fc5620a92300/volumes" Oct 11 11:04:44.945331 master-0 kubenswrapper[4790]: I1011 11:04:44.945238 4790 scope.go:117] "RemoveContainer" containerID="050e70bb3b6a03db41f9fcb784b5238c3ff9d94ed85c503d6f9f58f7bd27daa0" Oct 11 11:04:44.976955 master-0 kubenswrapper[4790]: I1011 11:04:44.976887 4790 scope.go:117] "RemoveContainer" containerID="4a77a0e25a1bbd76eb350e88d6052fb5f4963ac556fb275beeaf9d30c06320df" Oct 11 11:04:45.035420 master-0 kubenswrapper[4790]: I1011 11:04:45.035258 4790 scope.go:117] "RemoveContainer" containerID="3b1f890304ef9d089479614d38d81aabcf093a42157b52f334fa1a99c8f86aa6" Oct 11 11:04:45.072371 master-0 kubenswrapper[4790]: I1011 11:04:45.072316 4790 scope.go:117] "RemoveContainer" containerID="07ec7b09db6fb5294fadc7bd8337f6b789e9b95a2303619665336b8735fa4bfe" Oct 11 11:04:45.096264 master-0 kubenswrapper[4790]: I1011 11:04:45.096218 4790 scope.go:117] "RemoveContainer" containerID="f18d3e7808bc3bd6d8d3dfcffe3def526d4ab16b836ac39e9bb14dfceb0d8247" Oct 11 11:04:45.128765 master-0 kubenswrapper[4790]: I1011 11:04:45.128736 4790 scope.go:117] "RemoveContainer" containerID="fd9735379426d4418da18546aac8b7806a6015a386e483f957b980d675840314" Oct 11 11:04:45.158745 master-0 kubenswrapper[4790]: I1011 11:04:45.158602 4790 scope.go:117] "RemoveContainer" 
containerID="677c51ee7ba248bdecdce7b7bb9d050175056a091f08201d76d54e3406eb2697" Oct 11 11:04:45.180579 master-0 kubenswrapper[4790]: I1011 11:04:45.180449 4790 scope.go:117] "RemoveContainer" containerID="1cde782f190214155e020d24bbe2e2d5c9f2dc24b3fea8e9236ee944da092a1c" Oct 11 11:04:45.202289 master-0 kubenswrapper[4790]: I1011 11:04:45.202208 4790 scope.go:117] "RemoveContainer" containerID="fa9c7f461b0e315bcd532cda39de483b7f3baaed2714bed160ee9f75fc0f43db" Oct 11 11:04:45.223748 master-0 kubenswrapper[4790]: I1011 11:04:45.223677 4790 scope.go:117] "RemoveContainer" containerID="6c3235d5d6bf2dc75b5b651ccfac846d26bd4957ea4e687fad602fa916728d6b" Oct 11 11:04:45.242387 master-0 kubenswrapper[4790]: I1011 11:04:45.242353 4790 scope.go:117] "RemoveContainer" containerID="5ca20afffe15faa31f5c2c1443a96be8fe5b0268275280368238f1f4b32ef4f2" Oct 11 11:05:08.481964 master-0 kubenswrapper[4790]: I1011 11:05:08.479054 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb9b8c955-g7qzg"] Oct 11 11:05:08.481964 master-0 kubenswrapper[4790]: I1011 11:05:08.479950 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" podUID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerName="dnsmasq-dns" containerID="cri-o://0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c" gracePeriod=10 Oct 11 11:05:09.118018 master-0 kubenswrapper[4790]: I1011 11:05:09.117941 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" Oct 11 11:05:09.211572 master-0 kubenswrapper[4790]: I1011 11:05:09.211487 4790 generic.go:334] "Generic (PLEG): container finished" podID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerID="0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c" exitCode=0 Oct 11 11:05:09.211572 master-0 kubenswrapper[4790]: I1011 11:05:09.211562 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" event={"ID":"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c","Type":"ContainerDied","Data":"0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c"} Oct 11 11:05:09.211947 master-0 kubenswrapper[4790]: I1011 11:05:09.211600 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" Oct 11 11:05:09.211947 master-0 kubenswrapper[4790]: I1011 11:05:09.211608 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb9b8c955-g7qzg" event={"ID":"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c","Type":"ContainerDied","Data":"82ab564208fa75b6a416368edc3991b9aae0b1bdbf1f7ab61745c571e8067316"} Oct 11 11:05:09.211947 master-0 kubenswrapper[4790]: I1011 11:05:09.211625 4790 scope.go:117] "RemoveContainer" containerID="0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c" Oct 11 11:05:09.232183 master-0 kubenswrapper[4790]: I1011 11:05:09.232124 4790 scope.go:117] "RemoveContainer" containerID="148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079" Oct 11 11:05:09.258404 master-0 kubenswrapper[4790]: I1011 11:05:09.258334 4790 scope.go:117] "RemoveContainer" containerID="0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c" Oct 11 11:05:09.259377 master-0 kubenswrapper[4790]: E1011 11:05:09.259296 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c\": container with ID starting with 0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c not found: ID does not exist" containerID="0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c" Oct 11 11:05:09.259452 master-0 kubenswrapper[4790]: I1011 11:05:09.259404 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-svc\") pod \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " Oct 11 11:05:09.259502 master-0 kubenswrapper[4790]: I1011 11:05:09.259475 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-config\") pod \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " Oct 11 11:05:09.259547 master-0 kubenswrapper[4790]: I1011 11:05:09.259407 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c"} err="failed to get container status \"0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c\": rpc error: code = NotFound desc = could not find container \"0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c\": container with ID starting with 0bbeab6347207b28833870198731d207ead777fe2d62b52c20f939994946673c not found: ID does not exist" Oct 11 11:05:09.259603 master-0 kubenswrapper[4790]: I1011 11:05:09.259575 4790 scope.go:117] "RemoveContainer" containerID="148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079" Oct 11 11:05:09.259780 master-0 kubenswrapper[4790]: I1011 11:05:09.259740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4srbk\" (UniqueName: 
\"kubernetes.io/projected/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-kube-api-access-4srbk\") pod \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " Oct 11 11:05:09.260304 master-0 kubenswrapper[4790]: I1011 11:05:09.260263 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-sb\") pod \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " Oct 11 11:05:09.260366 master-0 kubenswrapper[4790]: I1011 11:05:09.260326 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-swift-storage-0\") pod \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " Oct 11 11:05:09.260414 master-0 kubenswrapper[4790]: E1011 11:05:09.260350 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079\": container with ID starting with 148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079 not found: ID does not exist" containerID="148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079" Oct 11 11:05:09.260487 master-0 kubenswrapper[4790]: I1011 11:05:09.260433 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079"} err="failed to get container status \"148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079\": rpc error: code = NotFound desc = could not find container \"148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079\": container with ID starting with 148a01e0193a2980ace147ec2984826e51e9857bd60dd500d2908e53b675c079 not found: ID does not 
exist" Oct 11 11:05:09.260487 master-0 kubenswrapper[4790]: I1011 11:05:09.260372 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-nb\") pod \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\" (UID: \"aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c\") " Oct 11 11:05:09.263221 master-0 kubenswrapper[4790]: I1011 11:05:09.263144 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-kube-api-access-4srbk" (OuterVolumeSpecName: "kube-api-access-4srbk") pod "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" (UID: "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c"). InnerVolumeSpecName "kube-api-access-4srbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 11:05:09.301046 master-0 kubenswrapper[4790]: I1011 11:05:09.300967 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" (UID: "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 11:05:09.303653 master-0 kubenswrapper[4790]: I1011 11:05:09.303605 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" (UID: "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 11:05:09.304953 master-0 kubenswrapper[4790]: I1011 11:05:09.304902 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" (UID: "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 11:05:09.312058 master-0 kubenswrapper[4790]: I1011 11:05:09.311989 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-config" (OuterVolumeSpecName: "config") pod "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" (UID: "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 11:05:09.316951 master-0 kubenswrapper[4790]: I1011 11:05:09.316878 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" (UID: "aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 11:05:09.363912 master-0 kubenswrapper[4790]: I1011 11:05:09.363823 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4srbk\" (UniqueName: \"kubernetes.io/projected/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-kube-api-access-4srbk\") on node \"master-0\" DevicePath \"\"" Oct 11 11:05:09.363912 master-0 kubenswrapper[4790]: I1011 11:05:09.363880 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Oct 11 11:05:09.363912 master-0 kubenswrapper[4790]: I1011 11:05:09.363899 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Oct 11 11:05:09.363912 master-0 kubenswrapper[4790]: I1011 11:05:09.363913 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Oct 11 11:05:09.363912 master-0 kubenswrapper[4790]: I1011 11:05:09.363926 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-dns-svc\") on node \"master-0\" DevicePath \"\"" Oct 11 11:05:09.363912 master-0 kubenswrapper[4790]: I1011 11:05:09.363938 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c-config\") on node \"master-0\" DevicePath \"\"" Oct 11 11:05:09.561992 master-0 kubenswrapper[4790]: I1011 11:05:09.561847 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb9b8c955-g7qzg"] Oct 11 11:05:09.569246 master-0 kubenswrapper[4790]: I1011 
11:05:09.569143 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb9b8c955-g7qzg"] Oct 11 11:05:10.307169 master-0 kubenswrapper[4790]: I1011 11:05:10.307057 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" path="/var/lib/kubelet/pods/aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c/volumes" Oct 11 11:05:19.566788 master-0 kubenswrapper[4790]: I1011 11:05:19.566678 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f984c5fd9-ljt8j"] Oct 11 11:05:19.567533 master-0 kubenswrapper[4790]: E1011 11:05:19.567225 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerName="init" Oct 11 11:05:19.567533 master-0 kubenswrapper[4790]: I1011 11:05:19.567250 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerName="init" Oct 11 11:05:19.567533 master-0 kubenswrapper[4790]: E1011 11:05:19.567300 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerName="dnsmasq-dns" Oct 11 11:05:19.567533 master-0 kubenswrapper[4790]: I1011 11:05:19.567313 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerName="dnsmasq-dns" Oct 11 11:05:19.567671 master-0 kubenswrapper[4790]: I1011 11:05:19.567612 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea1c85b-85f7-4996-bf3b-7f5ecf1e1d8c" containerName="dnsmasq-dns" Oct 11 11:05:19.569311 master-0 kubenswrapper[4790]: I1011 11:05:19.569275 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.573324 master-0 kubenswrapper[4790]: I1011 11:05:19.573280 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 11 11:05:19.573666 master-0 kubenswrapper[4790]: I1011 11:05:19.573636 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm" Oct 11 11:05:19.573912 master-0 kubenswrapper[4790]: I1011 11:05:19.573850 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 11:05:19.574256 master-0 kubenswrapper[4790]: I1011 11:05:19.574221 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"networkers" Oct 11 11:05:19.575608 master-0 kubenswrapper[4790]: I1011 11:05:19.575537 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 11:05:19.575796 master-0 kubenswrapper[4790]: I1011 11:05:19.575758 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 11 11:05:19.576399 master-0 kubenswrapper[4790]: I1011 11:05:19.576344 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 11:05:19.585454 master-0 kubenswrapper[4790]: I1011 11:05:19.585387 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f984c5fd9-ljt8j"] Oct 11 11:05:19.741147 master-0 kubenswrapper[4790]: I1011 11:05:19.741053 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-ovsdbserver-sb\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.741147 master-0 kubenswrapper[4790]: I1011 11:05:19.741131 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-dns-swift-storage-0\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.741495 master-0 kubenswrapper[4790]: I1011 11:05:19.741171 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-dns-svc\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.741495 master-0 kubenswrapper[4790]: I1011 11:05:19.741215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-config\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.741495 master-0 kubenswrapper[4790]: I1011 11:05:19.741245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networkers\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-networkers\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.741495 master-0 kubenswrapper[4790]: I1011 11:05:19.741267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-edpm\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.741495 master-0 kubenswrapper[4790]: I1011 
11:05:19.741292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4r2n\" (UniqueName: \"kubernetes.io/projected/94ed578b-910e-4144-a5c9-6d5e7a585b3d-kube-api-access-l4r2n\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.741495 master-0 kubenswrapper[4790]: I1011 11:05:19.741319 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-ovsdbserver-nb\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.843455 master-0 kubenswrapper[4790]: I1011 11:05:19.843299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-ovsdbserver-sb\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.843455 master-0 kubenswrapper[4790]: I1011 11:05:19.843380 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-dns-swift-storage-0\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.843455 master-0 kubenswrapper[4790]: I1011 11:05:19.843426 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-dns-svc\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 
11:05:19.843771 master-0 kubenswrapper[4790]: I1011 11:05:19.843462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-config\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.843771 master-0 kubenswrapper[4790]: I1011 11:05:19.843496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networkers\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-networkers\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.843771 master-0 kubenswrapper[4790]: I1011 11:05:19.843521 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-edpm\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.843771 master-0 kubenswrapper[4790]: I1011 11:05:19.843549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4r2n\" (UniqueName: \"kubernetes.io/projected/94ed578b-910e-4144-a5c9-6d5e7a585b3d-kube-api-access-l4r2n\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.843771 master-0 kubenswrapper[4790]: I1011 11:05:19.843572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-ovsdbserver-nb\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.844671 master-0 
kubenswrapper[4790]: I1011 11:05:19.844622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-dns-svc\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.844831 master-0 kubenswrapper[4790]: I1011 11:05:19.844799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-ovsdbserver-nb\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.844831 master-0 kubenswrapper[4790]: I1011 11:05:19.844815 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-edpm\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.845418 master-0 kubenswrapper[4790]: I1011 11:05:19.845363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-dns-swift-storage-0\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.845462 master-0 kubenswrapper[4790]: I1011 11:05:19.845424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-ovsdbserver-sb\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.845768 master-0 kubenswrapper[4790]: I1011 11:05:19.845671 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networkers\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-networkers\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.845823 master-0 kubenswrapper[4790]: I1011 11:05:19.845671 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ed578b-910e-4144-a5c9-6d5e7a585b3d-config\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.863035 master-0 kubenswrapper[4790]: I1011 11:05:19.862973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4r2n\" (UniqueName: \"kubernetes.io/projected/94ed578b-910e-4144-a5c9-6d5e7a585b3d-kube-api-access-l4r2n\") pod \"dnsmasq-dns-f984c5fd9-ljt8j\" (UID: \"94ed578b-910e-4144-a5c9-6d5e7a585b3d\") " pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:19.894329 master-0 kubenswrapper[4790]: I1011 11:05:19.894237 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:20.340500 master-0 kubenswrapper[4790]: I1011 11:05:20.340462 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f984c5fd9-ljt8j"] Oct 11 11:05:20.340916 master-0 kubenswrapper[4790]: W1011 11:05:20.340874 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94ed578b_910e_4144_a5c9_6d5e7a585b3d.slice/crio-2b1492c58561187ead8d6fb953bd7524cf9be19d615611ad419076f6d1580f7a WatchSource:0}: Error finding container 2b1492c58561187ead8d6fb953bd7524cf9be19d615611ad419076f6d1580f7a: Status 404 returned error can't find the container with id 2b1492c58561187ead8d6fb953bd7524cf9be19d615611ad419076f6d1580f7a Oct 11 11:05:21.341695 master-0 kubenswrapper[4790]: I1011 11:05:21.341619 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" event={"ID":"94ed578b-910e-4144-a5c9-6d5e7a585b3d","Type":"ContainerDied","Data":"8782cb137f123e61f9d257a7ae5d142f54f9b330f8c57f8e1b1a243ba9b0a7d7"} Oct 11 11:05:21.342281 master-0 kubenswrapper[4790]: I1011 11:05:21.341545 4790 generic.go:334] "Generic (PLEG): container finished" podID="94ed578b-910e-4144-a5c9-6d5e7a585b3d" containerID="8782cb137f123e61f9d257a7ae5d142f54f9b330f8c57f8e1b1a243ba9b0a7d7" exitCode=0 Oct 11 11:05:21.342281 master-0 kubenswrapper[4790]: I1011 11:05:21.341796 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" event={"ID":"94ed578b-910e-4144-a5c9-6d5e7a585b3d","Type":"ContainerStarted","Data":"2b1492c58561187ead8d6fb953bd7524cf9be19d615611ad419076f6d1580f7a"} Oct 11 11:05:22.352951 master-0 kubenswrapper[4790]: I1011 11:05:22.352887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" 
event={"ID":"94ed578b-910e-4144-a5c9-6d5e7a585b3d","Type":"ContainerStarted","Data":"ebab5a442a872a623eba61a0b0792dc58e71e175b0b733a278bc74216b925ae7"} Oct 11 11:05:22.353503 master-0 kubenswrapper[4790]: I1011 11:05:22.353094 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:22.391192 master-0 kubenswrapper[4790]: I1011 11:05:22.390964 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" podStartSLOduration=3.390935947 podStartE2EDuration="3.390935947s" podCreationTimestamp="2025-10-11 11:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 11:05:22.389430826 +0000 UTC m=+1598.943891138" watchObservedRunningTime="2025-10-11 11:05:22.390935947 +0000 UTC m=+1598.945396239" Oct 11 11:05:29.898700 master-0 kubenswrapper[4790]: I1011 11:05:29.898634 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f984c5fd9-ljt8j" Oct 11 11:05:30.111072 master-0 kubenswrapper[4790]: I1011 11:05:30.110808 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rgsq2"] Oct 11 11:05:30.121933 master-0 kubenswrapper[4790]: I1011 11:05:30.121826 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rgsq2"] Oct 11 11:05:30.302743 master-0 kubenswrapper[4790]: I1011 11:05:30.302683 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb6dd47-2665-4a3f-8773-2a61034146a3" path="/var/lib/kubelet/pods/7bb6dd47-2665-4a3f-8773-2a61034146a3/volumes" Oct 11 11:05:36.065837 master-0 kubenswrapper[4790]: I1011 11:05:36.065764 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nw6gg"] Oct 11 11:05:36.077932 master-0 kubenswrapper[4790]: I1011 11:05:36.077845 4790 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-r9jnj"] Oct 11 11:05:36.089389 master-0 kubenswrapper[4790]: I1011 11:05:36.089348 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nw6gg"] Oct 11 11:05:36.100831 master-0 kubenswrapper[4790]: I1011 11:05:36.100790 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-r9jnj"] Oct 11 11:05:36.309745 master-0 kubenswrapper[4790]: I1011 11:05:36.309620 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7ae2a3-6802-400c-bbe7-5729052a2c1c" path="/var/lib/kubelet/pods/5b7ae2a3-6802-400c-bbe7-5729052a2c1c/volumes" Oct 11 11:05:36.310821 master-0 kubenswrapper[4790]: I1011 11:05:36.310775 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b0929b8-354d-4de6-9e2d-ac6e11324b10" path="/var/lib/kubelet/pods/8b0929b8-354d-4de6-9e2d-ac6e11324b10/volumes" Oct 11 11:05:41.047205 master-0 kubenswrapper[4790]: I1011 11:05:41.047122 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8a39-account-create-clqqg"] Oct 11 11:05:41.052181 master-0 kubenswrapper[4790]: I1011 11:05:41.052115 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8a39-account-create-clqqg"] Oct 11 11:05:42.307211 master-0 kubenswrapper[4790]: I1011 11:05:42.307126 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37e1fe6-6e89-4407-a40f-cf494a35eccd" path="/var/lib/kubelet/pods/e37e1fe6-6e89-4407-a40f-cf494a35eccd/volumes" Oct 11 11:05:45.456016 master-0 kubenswrapper[4790]: I1011 11:05:45.455821 4790 scope.go:117] "RemoveContainer" containerID="5b50296ba2efac22efde8aae60a1ee89c11a8ace1ff375049f5a9b2bda8f8fc0" Oct 11 11:05:45.477290 master-0 kubenswrapper[4790]: I1011 11:05:45.477203 4790 scope.go:117] "RemoveContainer" containerID="187ba140386fe1bdb57e2319039e5e08987bfe52015b52422eac9cff8a82d276" Oct 11 11:05:45.512438 master-0 
kubenswrapper[4790]: I1011 11:05:45.512269 4790 scope.go:117] "RemoveContainer" containerID="ef89d0976a05facc749dfabb5416524787541999145463ce1f713dd9a9f315fb" Oct 11 11:05:45.554072 master-0 kubenswrapper[4790]: I1011 11:05:45.554009 4790 scope.go:117] "RemoveContainer" containerID="ad9c864509e03d2c97f9d070b630e91a99b7a68797b54f4da7ce040e5a112381" Oct 11 11:31:51.336347 master-0 kubenswrapper[4790]: E1011 11:31:51.336244 4790 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.34.10:34182->192.168.34.10:43579: write tcp 192.168.34.10:34182->192.168.34.10:43579: write: broken pipe Oct 11 11:31:51.358771 master-0 kubenswrapper[4790]: I1011 11:31:51.358686 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rtr4h_0bd4ff7d-5743-4ecb-86e8-72a738214533/controller/0.log" Oct 11 11:31:51.370831 master-0 kubenswrapper[4790]: I1011 11:31:51.370770 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rtr4h_0bd4ff7d-5743-4ecb-86e8-72a738214533/kube-rbac-proxy/0.log" Oct 11 11:31:51.414250 master-0 kubenswrapper[4790]: I1011 11:31:51.414184 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/controller/0.log" Oct 11 11:31:51.479035 master-0 kubenswrapper[4790]: I1011 11:31:51.478888 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7f4xb_0510dc20-c216-4f9a-b547-246dfdfc7d6f/nmstate-handler/0.log" Oct 11 11:31:51.540074 master-0 kubenswrapper[4790]: I1011 11:31:51.539739 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-w4js8_ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e/nmstate-metrics/0.log" Oct 11 11:31:51.564994 master-0 kubenswrapper[4790]: I1011 11:31:51.564926 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-w4js8_ca51cef5-fa00-4ea1-b7e6-e6e70bce9a0e/kube-rbac-proxy/0.log" Oct 11 11:31:51.592623 master-0 kubenswrapper[4790]: I1011 11:31:51.592512 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-pnhrj_6b296384-0413-4a1d-825b-530b97e53c9a/nmstate-operator/0.log" Oct 11 11:31:51.612101 master-0 kubenswrapper[4790]: I1011 11:31:51.612031 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-nf8q6_ba695300-f2da-45e9-a825-81d462fc2d37/nmstate-webhook/0.log" Oct 11 11:31:52.145832 master-0 kubenswrapper[4790]: I1011 11:31:52.143160 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-guard-master-0_c6436766-e7b0-471b-acbf-861280191521/guard/0.log" Oct 11 11:31:52.286005 master-0 kubenswrapper[4790]: I1011 11:31:52.285943 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcdctl/0.log" Oct 11 11:31:52.396061 master-0 kubenswrapper[4790]: I1011 11:31:52.395898 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/frr/0.log" Oct 11 11:31:52.409457 master-0 kubenswrapper[4790]: I1011 11:31:52.409395 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/reloader/0.log" Oct 11 11:31:52.423056 master-0 kubenswrapper[4790]: I1011 11:31:52.422973 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/frr-metrics/0.log" Oct 11 11:31:52.438468 master-0 kubenswrapper[4790]: I1011 11:31:52.438419 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/kube-rbac-proxy/0.log" Oct 11 11:31:52.456688 master-0 
kubenswrapper[4790]: I1011 11:31:52.456572 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/kube-rbac-proxy-frr/0.log" Oct 11 11:31:52.456688 master-0 kubenswrapper[4790]: I1011 11:31:52.456640 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd/0.log" Oct 11 11:31:52.467528 master-0 kubenswrapper[4790]: I1011 11:31:52.467189 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/cp-frr-files/0.log" Oct 11 11:31:52.480403 master-0 kubenswrapper[4790]: I1011 11:31:52.480361 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/cp-reloader/0.log" Oct 11 11:31:52.483077 master-0 kubenswrapper[4790]: I1011 11:31:52.483029 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-metrics/0.log" Oct 11 11:31:52.492478 master-0 kubenswrapper[4790]: I1011 11:31:52.492434 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/cp-metrics/0.log" Oct 11 11:31:52.513027 master-0 kubenswrapper[4790]: I1011 11:31:52.512981 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-readyz/0.log" Oct 11 11:31:52.533682 master-0 kubenswrapper[4790]: I1011 11:31:52.533573 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-rev/0.log" Oct 11 11:31:52.560358 master-0 kubenswrapper[4790]: I1011 11:31:52.560293 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/setup/0.log" Oct 11 11:31:52.592784 master-0 
kubenswrapper[4790]: I1011 11:31:52.592724 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-ensure-env-vars/0.log" Oct 11 11:31:52.617575 master-0 kubenswrapper[4790]: I1011 11:31:52.617490 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-resources-copy/0.log" Oct 11 11:31:53.437571 master-0 kubenswrapper[4790]: I1011 11:31:53.437504 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-10-master-0_527d9cd7-412d-4afb-9212-c8697426a964/installer/0.log" Oct 11 11:31:53.664914 master-0 kubenswrapper[4790]: I1011 11:31:53.664626 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-8-master-0_a3934355-bb61-4316-b164-05294e12906a/installer/0.log" Oct 11 11:31:53.686487 master-0 kubenswrapper[4790]: I1011 11:31:53.686413 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_revision-pruner-10-master-0_e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3/pruner/0.log" Oct 11 11:31:54.124805 master-0 kubenswrapper[4790]: I1011 11:31:54.124488 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6fccd5ccc-khqd5_a6689745-4f25-4776-9f5c-6bfd7abe62a8/oauth-openshift/0.log" Oct 11 11:31:54.941139 master-0 kubenswrapper[4790]: I1011 11:31:54.940951 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fg6sc/master-0-debug-7hncs"] Oct 11 11:31:54.943878 master-0 kubenswrapper[4790]: I1011 11:31:54.943833 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:54.947509 master-0 kubenswrapper[4790]: I1011 11:31:54.947450 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fg6sc"/"openshift-service-ca.crt" Oct 11 11:31:54.947627 master-0 kubenswrapper[4790]: I1011 11:31:54.947560 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fg6sc"/"kube-root-ca.crt" Oct 11 11:31:55.090082 master-0 kubenswrapper[4790]: I1011 11:31:55.089996 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrkxr\" (UniqueName: \"kubernetes.io/projected/acb7fe6d-dadf-4951-9d42-65d05d41ba73-kube-api-access-jrkxr\") pod \"master-0-debug-7hncs\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") " pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:55.090082 master-0 kubenswrapper[4790]: I1011 11:31:55.090058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acb7fe6d-dadf-4951-9d42-65d05d41ba73-host\") pod \"master-0-debug-7hncs\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") " pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:55.192088 master-0 kubenswrapper[4790]: I1011 11:31:55.191897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrkxr\" (UniqueName: \"kubernetes.io/projected/acb7fe6d-dadf-4951-9d42-65d05d41ba73-kube-api-access-jrkxr\") pod \"master-0-debug-7hncs\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") " pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:55.192088 master-0 kubenswrapper[4790]: I1011 11:31:55.191976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acb7fe6d-dadf-4951-9d42-65d05d41ba73-host\") pod 
\"master-0-debug-7hncs\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") " pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:55.192385 master-0 kubenswrapper[4790]: I1011 11:31:55.192209 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acb7fe6d-dadf-4951-9d42-65d05d41ba73-host\") pod \"master-0-debug-7hncs\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") " pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:55.213086 master-0 kubenswrapper[4790]: I1011 11:31:55.213000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrkxr\" (UniqueName: \"kubernetes.io/projected/acb7fe6d-dadf-4951-9d42-65d05d41ba73-kube-api-access-jrkxr\") pod \"master-0-debug-7hncs\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") " pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:55.262852 master-0 kubenswrapper[4790]: I1011 11:31:55.262770 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" Oct 11 11:31:55.290606 master-0 kubenswrapper[4790]: W1011 11:31:55.290531 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb7fe6d_dadf_4951_9d42_65d05d41ba73.slice/crio-09c503d435fed13d4ce8ce4cf4c859f62d4e994980580099cc22009b20c5bebc WatchSource:0}: Error finding container 09c503d435fed13d4ce8ce4cf4c859f62d4e994980580099cc22009b20c5bebc: Status 404 returned error can't find the container with id 09c503d435fed13d4ce8ce4cf4c859f62d4e994980580099cc22009b20c5bebc Oct 11 11:31:55.291525 master-0 kubenswrapper[4790]: I1011 11:31:55.291481 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-54x4w_7359c204-2acb-4c3b-b05f-2a124f3862fb/frr-k8s-webhook-server/0.log" Oct 11 11:31:55.296715 master-0 kubenswrapper[4790]: I1011 11:31:55.296667 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 11:31:55.323384 master-0 kubenswrapper[4790]: I1011 11:31:55.323314 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-56b566d9f-hppvq_01ae1cda-0c92-4f86-bff5-90e6cbb3881e/manager/0.log" Oct 11 11:31:55.342178 master-0 kubenswrapper[4790]: I1011 11:31:55.340399 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-84d69c968c-btbcm_3d4bad0b-955f-4d0e-8849-8257c50682cb/webhook-server/0.log" Oct 11 11:31:56.253921 master-0 kubenswrapper[4790]: I1011 11:31:56.253825 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" event={"ID":"acb7fe6d-dadf-4951-9d42-65d05d41ba73","Type":"ContainerStarted","Data":"09c503d435fed13d4ce8ce4cf4c859f62d4e994980580099cc22009b20c5bebc"} Oct 11 11:31:56.355188 master-0 kubenswrapper[4790]: I1011 
11:31:56.355113 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf"] Oct 11 11:31:56.357000 master-0 kubenswrapper[4790]: I1011 11:31:56.356970 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.363166 master-0 kubenswrapper[4790]: I1011 11:31:56.363076 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf"] Oct 11 11:31:56.518679 master-0 kubenswrapper[4790]: I1011 11:31:56.518439 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwfd8\" (UniqueName: \"kubernetes.io/projected/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-kube-api-access-kwfd8\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.518679 master-0 kubenswrapper[4790]: I1011 11:31:56.518536 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-lib-modules\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.518679 master-0 kubenswrapper[4790]: I1011 11:31:56.518627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-podres\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.519156 master-0 kubenswrapper[4790]: I1011 11:31:56.519103 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-proc\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.519470 master-0 kubenswrapper[4790]: I1011 11:31:56.519398 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-sys\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.623201 master-0 kubenswrapper[4790]: I1011 11:31:56.623001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-podres\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.623939 master-0 kubenswrapper[4790]: I1011 11:31:56.623232 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-proc\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.623939 master-0 kubenswrapper[4790]: I1011 11:31:56.623397 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-sys\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.623939 master-0 
kubenswrapper[4790]: I1011 11:31:56.623468 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-proc\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.623939 master-0 kubenswrapper[4790]: I1011 11:31:56.623247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-podres\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.623939 master-0 kubenswrapper[4790]: I1011 11:31:56.623561 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwfd8\" (UniqueName: \"kubernetes.io/projected/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-kube-api-access-kwfd8\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.624196 master-0 kubenswrapper[4790]: I1011 11:31:56.623959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-lib-modules\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.624552 master-0 kubenswrapper[4790]: I1011 11:31:56.624508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-sys\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " 
pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.624843 master-0 kubenswrapper[4790]: I1011 11:31:56.624750 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-lib-modules\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.648527 master-0 kubenswrapper[4790]: I1011 11:31:56.648353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwfd8\" (UniqueName: \"kubernetes.io/projected/b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd-kube-api-access-kwfd8\") pod \"perf-node-gather-daemonset-scbrf\" (UID: \"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd\") " pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.689411 master-0 kubenswrapper[4790]: I1011 11:31:56.689342 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" Oct 11 11:31:56.937672 master-0 kubenswrapper[4790]: I1011 11:31:56.937603 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8n7ld_7d3e23ec-dfa6-46d4-bf57-4e89ee459be5/speaker/0.log" Oct 11 11:31:56.947961 master-0 kubenswrapper[4790]: I1011 11:31:56.947920 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8n7ld_7d3e23ec-dfa6-46d4-bf57-4e89ee459be5/kube-rbac-proxy/0.log" Oct 11 11:31:57.208652 master-0 kubenswrapper[4790]: I1011 11:31:57.208573 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf"] Oct 11 11:31:57.212270 master-0 kubenswrapper[4790]: W1011 11:31:57.212192 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb9e5dfa9_1ffa_4e49_b8ee_d4b161c679dd.slice/crio-310fe9c968f1834674155ef4bab4f64bf1043980301e0db1ea56c008309ee924 WatchSource:0}: Error finding container 310fe9c968f1834674155ef4bab4f64bf1043980301e0db1ea56c008309ee924: Status 404 returned error can't find the container with id 310fe9c968f1834674155ef4bab4f64bf1043980301e0db1ea56c008309ee924 Oct 11 11:31:57.268506 master-0 kubenswrapper[4790]: I1011 11:31:57.268410 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" event={"ID":"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd","Type":"ContainerStarted","Data":"310fe9c968f1834674155ef4bab4f64bf1043980301e0db1ea56c008309ee924"} Oct 11 11:31:57.484030 master-0 kubenswrapper[4790]: I1011 11:31:57.483883 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-68f4c55ff4-nk86r_bc183705-096c-4af1-adf7-d3cd0e4532e1/oauth-apiserver/0.log" Oct 11 11:31:57.506232 master-0 kubenswrapper[4790]: I1011 11:31:57.506126 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-oauth-apiserver_apiserver-68f4c55ff4-nk86r_bc183705-096c-4af1-adf7-d3cd0e4532e1/fix-audit-permissions/0.log"
Oct 11 11:31:58.288016 master-0 kubenswrapper[4790]: I1011 11:31:58.287930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" event={"ID":"b9e5dfa9-1ffa-4e49-b8ee-d4b161c679dd","Type":"ContainerStarted","Data":"b9ab8202a592218ee6d308180d1c078e70c3637a06447a9c00d988c40f228b34"}
Oct 11 11:31:58.289348 master-0 kubenswrapper[4790]: I1011 11:31:58.288239 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf"
Oct 11 11:32:04.354891 master-0 kubenswrapper[4790]: I1011 11:32:04.354595 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" event={"ID":"acb7fe6d-dadf-4951-9d42-65d05d41ba73","Type":"ContainerStarted","Data":"97cb9666a75fcb998f11455fa3686aebb58058ffc5ab5c048ecfc3bf20463b14"}
Oct 11 11:32:04.399744 master-0 kubenswrapper[4790]: I1011 11:32:04.399600 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" podStartSLOduration=2.028791192 podStartE2EDuration="10.399563598s" podCreationTimestamp="2025-10-11 11:31:54 +0000 UTC" firstStartedPulling="2025-10-11 11:31:55.296624217 +0000 UTC m=+3191.851084509" lastFinishedPulling="2025-10-11 11:32:03.667396593 +0000 UTC m=+3200.221856915" observedRunningTime="2025-10-11 11:32:04.394547043 +0000 UTC m=+3200.949007385" watchObservedRunningTime="2025-10-11 11:32:04.399563598 +0000 UTC m=+3200.954023930"
Oct 11 11:32:04.401803 master-0 kubenswrapper[4790]: I1011 11:32:04.401675 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf" podStartSLOduration=8.401658855 podStartE2EDuration="8.401658855s" podCreationTimestamp="2025-10-11 11:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 11:31:58.313856266 +0000 UTC m=+3194.868316618" watchObservedRunningTime="2025-10-11 11:32:04.401658855 +0000 UTC m=+3200.956119177"
Oct 11 11:32:06.726232 master-0 kubenswrapper[4790]: I1011 11:32:06.726084 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fg6sc/perf-node-gather-daemonset-scbrf"
Oct 11 11:32:10.157522 master-0 kubenswrapper[4790]: I1011 11:32:10.157430 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xznwp_9c1b597b-dba4-4011-9acd-e6d40ed8aea4/dns/0.log"
Oct 11 11:32:10.177572 master-0 kubenswrapper[4790]: I1011 11:32:10.177514 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xznwp_9c1b597b-dba4-4011-9acd-e6d40ed8aea4/kube-rbac-proxy/0.log"
Oct 11 11:32:10.205061 master-0 kubenswrapper[4790]: I1011 11:32:10.204962 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5kghv_00e9cb61-65c4-4e6a-bb0c-2428529c63bf/dns-node-resolver/0.log"
Oct 11 11:32:11.685244 master-0 kubenswrapper[4790]: I1011 11:32:11.685161 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-guard-master-0_c6436766-e7b0-471b-acbf-861280191521/guard/0.log"
Oct 11 11:32:11.776768 master-0 kubenswrapper[4790]: I1011 11:32:11.776659 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcdctl/0.log"
Oct 11 11:32:11.953359 master-0 kubenswrapper[4790]: I1011 11:32:11.953179 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd/0.log"
Oct 11 11:32:11.974792 master-0 kubenswrapper[4790]: I1011 11:32:11.974724 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-metrics/0.log"
Oct 11 11:32:12.003019 master-0 kubenswrapper[4790]: I1011 11:32:12.002961 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-readyz/0.log"
Oct 11 11:32:12.024334 master-0 kubenswrapper[4790]: I1011 11:32:12.024284 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-rev/0.log"
Oct 11 11:32:12.046957 master-0 kubenswrapper[4790]: I1011 11:32:12.046888 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/setup/0.log"
Oct 11 11:32:12.069318 master-0 kubenswrapper[4790]: I1011 11:32:12.069246 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-ensure-env-vars/0.log"
Oct 11 11:32:12.093491 master-0 kubenswrapper[4790]: I1011 11:32:12.093396 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_14286286be88b59efc7cfc15eca1cc38/etcd-resources-copy/0.log"
Oct 11 11:32:12.886695 master-0 kubenswrapper[4790]: I1011 11:32:12.886589 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-10-master-0_527d9cd7-412d-4afb-9212-c8697426a964/installer/0.log"
Oct 11 11:32:13.101680 master-0 kubenswrapper[4790]: I1011 11:32:13.101583 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-8-master-0_a3934355-bb61-4316-b164-05294e12906a/installer/0.log"
Oct 11 11:32:13.127350 master-0 kubenswrapper[4790]: I1011 11:32:13.127287 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_revision-pruner-10-master-0_e35e5ca9-d4d4-47f2-a2d0-217f9ac77ba3/pruner/0.log"
Oct 11 11:32:14.343571 master-0 kubenswrapper[4790]: I1011 11:32:14.343504 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g99cx_4e2d32e6-3363-4389-ad6a-cfd917e568d2/node-ca/0.log"
Oct 11 11:32:15.877808 master-0 kubenswrapper[4790]: I1011 11:32:15.877733 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6xnjz_df4afece-b896-4fea-8b5f-ccebc400ee9f/serve-healthcheck-canary/0.log"
Oct 11 11:32:18.795955 master-0 kubenswrapper[4790]: I1011 11:32:18.795823 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e3e6a069-f9e0-417c-9226-5ef929699b39/alertmanager/0.log"
Oct 11 11:32:18.826221 master-0 kubenswrapper[4790]: I1011 11:32:18.825634 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e3e6a069-f9e0-417c-9226-5ef929699b39/config-reloader/0.log"
Oct 11 11:32:18.918425 master-0 kubenswrapper[4790]: I1011 11:32:18.918343 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e3e6a069-f9e0-417c-9226-5ef929699b39/kube-rbac-proxy-web/0.log"
Oct 11 11:32:18.936108 master-0 kubenswrapper[4790]: I1011 11:32:18.935378 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e3e6a069-f9e0-417c-9226-5ef929699b39/kube-rbac-proxy/0.log"
Oct 11 11:32:18.963359 master-0 kubenswrapper[4790]: I1011 11:32:18.963010 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e3e6a069-f9e0-417c-9226-5ef929699b39/kube-rbac-proxy-metric/0.log"
Oct 11 11:32:18.979569 master-0 kubenswrapper[4790]: I1011 11:32:18.979503 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e3e6a069-f9e0-417c-9226-5ef929699b39/prom-label-proxy/0.log"
Oct 11 11:32:19.003452 master-0 kubenswrapper[4790]: I1011 11:32:19.003379 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e3e6a069-f9e0-417c-9226-5ef929699b39/init-config-reloader/0.log"
Oct 11 11:32:19.471463 master-0 kubenswrapper[4790]: I1011 11:32:19.471363 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7d46fcc5c6-n88q4_1254ac82-5820-431e-baeb-3ae7d7997b38/metrics-server/0.log"
Oct 11 11:32:19.498911 master-0 kubenswrapper[4790]: I1011 11:32:19.498827 4790 generic.go:334] "Generic (PLEG): container finished" podID="acb7fe6d-dadf-4951-9d42-65d05d41ba73" containerID="97cb9666a75fcb998f11455fa3686aebb58058ffc5ab5c048ecfc3bf20463b14" exitCode=0
Oct 11 11:32:19.498911 master-0 kubenswrapper[4790]: I1011 11:32:19.498905 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs" event={"ID":"acb7fe6d-dadf-4951-9d42-65d05d41ba73","Type":"ContainerDied","Data":"97cb9666a75fcb998f11455fa3686aebb58058ffc5ab5c048ecfc3bf20463b14"}
Oct 11 11:32:19.691731 master-0 kubenswrapper[4790]: I1011 11:32:19.691653 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l66k2_7d9f4c3d-57bd-49f6-94f2-47670b385318/node-exporter/0.log"
Oct 11 11:32:19.717125 master-0 kubenswrapper[4790]: I1011 11:32:19.717060 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l66k2_7d9f4c3d-57bd-49f6-94f2-47670b385318/kube-rbac-proxy/0.log"
Oct 11 11:32:19.745164 master-0 kubenswrapper[4790]: I1011 11:32:19.745032 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l66k2_7d9f4c3d-57bd-49f6-94f2-47670b385318/init-textfile/0.log"
Oct 11 11:32:19.962266 master-0 kubenswrapper[4790]: I1011 11:32:19.962180 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_27006098-2092-43c6-97f8-0219e7fc4b81/prometheus/0.log"
Oct 11 11:32:19.981731 master-0 kubenswrapper[4790]: I1011 11:32:19.981640 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_27006098-2092-43c6-97f8-0219e7fc4b81/config-reloader/0.log"
Oct 11 11:32:20.006199 master-0 kubenswrapper[4790]: I1011 11:32:20.006038 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_27006098-2092-43c6-97f8-0219e7fc4b81/thanos-sidecar/0.log"
Oct 11 11:32:20.140974 master-0 kubenswrapper[4790]: I1011 11:32:20.140907 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_27006098-2092-43c6-97f8-0219e7fc4b81/kube-rbac-proxy-web/0.log"
Oct 11 11:32:20.165265 master-0 kubenswrapper[4790]: I1011 11:32:20.165221 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_27006098-2092-43c6-97f8-0219e7fc4b81/kube-rbac-proxy/0.log"
Oct 11 11:32:20.195692 master-0 kubenswrapper[4790]: I1011 11:32:20.195638 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_27006098-2092-43c6-97f8-0219e7fc4b81/kube-rbac-proxy-thanos/0.log"
Oct 11 11:32:20.219624 master-0 kubenswrapper[4790]: I1011 11:32:20.219558 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_27006098-2092-43c6-97f8-0219e7fc4b81/init-config-reloader/0.log"
Oct 11 11:32:20.594391 master-0 kubenswrapper[4790]: I1011 11:32:20.594329 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs"
Oct 11 11:32:20.662825 master-0 kubenswrapper[4790]: I1011 11:32:20.662664 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fg6sc/master-0-debug-7hncs"]
Oct 11 11:32:20.667488 master-0 kubenswrapper[4790]: I1011 11:32:20.667427 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fg6sc/master-0-debug-7hncs"]
Oct 11 11:32:20.699141 master-0 kubenswrapper[4790]: I1011 11:32:20.699054 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acb7fe6d-dadf-4951-9d42-65d05d41ba73-host\") pod \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") "
Oct 11 11:32:20.699420 master-0 kubenswrapper[4790]: I1011 11:32:20.699261 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acb7fe6d-dadf-4951-9d42-65d05d41ba73-host" (OuterVolumeSpecName: "host") pod "acb7fe6d-dadf-4951-9d42-65d05d41ba73" (UID: "acb7fe6d-dadf-4951-9d42-65d05d41ba73"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 11:32:20.699639 master-0 kubenswrapper[4790]: I1011 11:32:20.699587 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrkxr\" (UniqueName: \"kubernetes.io/projected/acb7fe6d-dadf-4951-9d42-65d05d41ba73-kube-api-access-jrkxr\") pod \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\" (UID: \"acb7fe6d-dadf-4951-9d42-65d05d41ba73\") "
Oct 11 11:32:20.700919 master-0 kubenswrapper[4790]: I1011 11:32:20.700871 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acb7fe6d-dadf-4951-9d42-65d05d41ba73-host\") on node \"master-0\" DevicePath \"\""
Oct 11 11:32:20.709018 master-0 kubenswrapper[4790]: I1011 11:32:20.708960 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb7fe6d-dadf-4951-9d42-65d05d41ba73-kube-api-access-jrkxr" (OuterVolumeSpecName: "kube-api-access-jrkxr") pod "acb7fe6d-dadf-4951-9d42-65d05d41ba73" (UID: "acb7fe6d-dadf-4951-9d42-65d05d41ba73"). InnerVolumeSpecName "kube-api-access-jrkxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 11:32:20.750248 master-0 kubenswrapper[4790]: I1011 11:32:20.750068 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f646dd4d8-qxd8w_06012e2a-b507-48ad-9740-2c3cb3af5bdf/thanos-query/0.log"
Oct 11 11:32:20.803000 master-0 kubenswrapper[4790]: I1011 11:32:20.802927 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrkxr\" (UniqueName: \"kubernetes.io/projected/acb7fe6d-dadf-4951-9d42-65d05d41ba73-kube-api-access-jrkxr\") on node \"master-0\" DevicePath \"\""
Oct 11 11:32:20.835420 master-0 kubenswrapper[4790]: I1011 11:32:20.835359 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f646dd4d8-qxd8w_06012e2a-b507-48ad-9740-2c3cb3af5bdf/kube-rbac-proxy-web/0.log"
Oct 11 11:32:20.851569 master-0 kubenswrapper[4790]: I1011 11:32:20.851538 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f646dd4d8-qxd8w_06012e2a-b507-48ad-9740-2c3cb3af5bdf/kube-rbac-proxy/0.log"
Oct 11 11:32:20.876621 master-0 kubenswrapper[4790]: I1011 11:32:20.876563 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f646dd4d8-qxd8w_06012e2a-b507-48ad-9740-2c3cb3af5bdf/prom-label-proxy/0.log"
Oct 11 11:32:20.898191 master-0 kubenswrapper[4790]: I1011 11:32:20.898157 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f646dd4d8-qxd8w_06012e2a-b507-48ad-9740-2c3cb3af5bdf/kube-rbac-proxy-rules/0.log"
Oct 11 11:32:20.921835 master-0 kubenswrapper[4790]: I1011 11:32:20.921646 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f646dd4d8-qxd8w_06012e2a-b507-48ad-9740-2c3cb3af5bdf/kube-rbac-proxy-metrics/0.log"
Oct 11 11:32:21.520024 master-0 kubenswrapper[4790]: I1011 11:32:21.519920 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c503d435fed13d4ce8ce4cf4c859f62d4e994980580099cc22009b20c5bebc"
Oct 11 11:32:21.521095 master-0 kubenswrapper[4790]: I1011 11:32:21.520050 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-7hncs"
Oct 11 11:32:21.914130 master-0 kubenswrapper[4790]: I1011 11:32:21.913971 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fg6sc/master-0-debug-9jp8d"]
Oct 11 11:32:21.914604 master-0 kubenswrapper[4790]: E1011 11:32:21.914378 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb7fe6d-dadf-4951-9d42-65d05d41ba73" containerName="container-00"
Oct 11 11:32:21.914604 master-0 kubenswrapper[4790]: I1011 11:32:21.914400 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb7fe6d-dadf-4951-9d42-65d05d41ba73" containerName="container-00"
Oct 11 11:32:21.914604 master-0 kubenswrapper[4790]: I1011 11:32:21.914562 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb7fe6d-dadf-4951-9d42-65d05d41ba73" containerName="container-00"
Oct 11 11:32:21.915359 master-0 kubenswrapper[4790]: I1011 11:32:21.915322 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.028658 master-0 kubenswrapper[4790]: I1011 11:32:22.028566 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbss\" (UniqueName: \"kubernetes.io/projected/99bdb546-1d3e-4fa2-83c2-e559b974621a-kube-api-access-8wbss\") pod \"master-0-debug-9jp8d\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") " pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.028658 master-0 kubenswrapper[4790]: I1011 11:32:22.028665 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99bdb546-1d3e-4fa2-83c2-e559b974621a-host\") pod \"master-0-debug-9jp8d\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") " pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.133074 master-0 kubenswrapper[4790]: I1011 11:32:22.132966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbss\" (UniqueName: \"kubernetes.io/projected/99bdb546-1d3e-4fa2-83c2-e559b974621a-kube-api-access-8wbss\") pod \"master-0-debug-9jp8d\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") " pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.133514 master-0 kubenswrapper[4790]: I1011 11:32:22.133367 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99bdb546-1d3e-4fa2-83c2-e559b974621a-host\") pod \"master-0-debug-9jp8d\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") " pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.133625 master-0 kubenswrapper[4790]: I1011 11:32:22.133521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99bdb546-1d3e-4fa2-83c2-e559b974621a-host\") pod \"master-0-debug-9jp8d\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") " pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.154793 master-0 kubenswrapper[4790]: I1011 11:32:22.154332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbss\" (UniqueName: \"kubernetes.io/projected/99bdb546-1d3e-4fa2-83c2-e559b974621a-kube-api-access-8wbss\") pod \"master-0-debug-9jp8d\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") " pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.240490 master-0 kubenswrapper[4790]: I1011 11:32:22.240310 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:22.284171 master-0 kubenswrapper[4790]: W1011 11:32:22.284074 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99bdb546_1d3e_4fa2_83c2_e559b974621a.slice/crio-158fd73cc9b9682b8472c8cd1c337a61436987a7d0378ee49445bc8e3a2f86d9 WatchSource:0}: Error finding container 158fd73cc9b9682b8472c8cd1c337a61436987a7d0378ee49445bc8e3a2f86d9: Status 404 returned error can't find the container with id 158fd73cc9b9682b8472c8cd1c337a61436987a7d0378ee49445bc8e3a2f86d9
Oct 11 11:32:22.305940 master-0 kubenswrapper[4790]: I1011 11:32:22.305863 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb7fe6d-dadf-4951-9d42-65d05d41ba73" path="/var/lib/kubelet/pods/acb7fe6d-dadf-4951-9d42-65d05d41ba73/volumes"
Oct 11 11:32:22.530829 master-0 kubenswrapper[4790]: I1011 11:32:22.530649 4790 generic.go:334] "Generic (PLEG): container finished" podID="99bdb546-1d3e-4fa2-83c2-e559b974621a" containerID="4b15916ae1c47dcd2f111c279ca5484dea5119eb334e236bfe4a11ac7abf19d2" exitCode=1
Oct 11 11:32:22.530829 master-0 kubenswrapper[4790]: I1011 11:32:22.530697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d" event={"ID":"99bdb546-1d3e-4fa2-83c2-e559b974621a","Type":"ContainerDied","Data":"4b15916ae1c47dcd2f111c279ca5484dea5119eb334e236bfe4a11ac7abf19d2"}
Oct 11 11:32:22.530829 master-0 kubenswrapper[4790]: I1011 11:32:22.530751 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d" event={"ID":"99bdb546-1d3e-4fa2-83c2-e559b974621a","Type":"ContainerStarted","Data":"158fd73cc9b9682b8472c8cd1c337a61436987a7d0378ee49445bc8e3a2f86d9"}
Oct 11 11:32:22.599900 master-0 kubenswrapper[4790]: I1011 11:32:22.599809 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fg6sc/master-0-debug-9jp8d"]
Oct 11 11:32:22.606732 master-0 kubenswrapper[4790]: I1011 11:32:22.606672 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fg6sc/master-0-debug-9jp8d"]
Oct 11 11:32:23.267924 master-0 kubenswrapper[4790]: I1011 11:32:23.267845 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rtr4h_0bd4ff7d-5743-4ecb-86e8-72a738214533/controller/0.log"
Oct 11 11:32:23.286984 master-0 kubenswrapper[4790]: I1011 11:32:23.286928 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rtr4h_0bd4ff7d-5743-4ecb-86e8-72a738214533/kube-rbac-proxy/0.log"
Oct 11 11:32:23.326257 master-0 kubenswrapper[4790]: I1011 11:32:23.326175 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/controller/0.log"
Oct 11 11:32:23.643860 master-0 kubenswrapper[4790]: I1011 11:32:23.643803 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:23.765251 master-0 kubenswrapper[4790]: I1011 11:32:23.765166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99bdb546-1d3e-4fa2-83c2-e559b974621a-host\") pod \"99bdb546-1d3e-4fa2-83c2-e559b974621a\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") "
Oct 11 11:32:23.765517 master-0 kubenswrapper[4790]: I1011 11:32:23.765293 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bdb546-1d3e-4fa2-83c2-e559b974621a-host" (OuterVolumeSpecName: "host") pod "99bdb546-1d3e-4fa2-83c2-e559b974621a" (UID: "99bdb546-1d3e-4fa2-83c2-e559b974621a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 11 11:32:23.765552 master-0 kubenswrapper[4790]: I1011 11:32:23.765533 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wbss\" (UniqueName: \"kubernetes.io/projected/99bdb546-1d3e-4fa2-83c2-e559b974621a-kube-api-access-8wbss\") pod \"99bdb546-1d3e-4fa2-83c2-e559b974621a\" (UID: \"99bdb546-1d3e-4fa2-83c2-e559b974621a\") "
Oct 11 11:32:23.766484 master-0 kubenswrapper[4790]: I1011 11:32:23.766436 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99bdb546-1d3e-4fa2-83c2-e559b974621a-host\") on node \"master-0\" DevicePath \"\""
Oct 11 11:32:23.769054 master-0 kubenswrapper[4790]: I1011 11:32:23.768982 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bdb546-1d3e-4fa2-83c2-e559b974621a-kube-api-access-8wbss" (OuterVolumeSpecName: "kube-api-access-8wbss") pod "99bdb546-1d3e-4fa2-83c2-e559b974621a" (UID: "99bdb546-1d3e-4fa2-83c2-e559b974621a"). InnerVolumeSpecName "kube-api-access-8wbss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 11:32:23.868922 master-0 kubenswrapper[4790]: I1011 11:32:23.868842 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wbss\" (UniqueName: \"kubernetes.io/projected/99bdb546-1d3e-4fa2-83c2-e559b974621a-kube-api-access-8wbss\") on node \"master-0\" DevicePath \"\""
Oct 11 11:32:24.303222 master-0 kubenswrapper[4790]: I1011 11:32:24.303176 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bdb546-1d3e-4fa2-83c2-e559b974621a" path="/var/lib/kubelet/pods/99bdb546-1d3e-4fa2-83c2-e559b974621a/volumes"
Oct 11 11:32:24.374627 master-0 kubenswrapper[4790]: I1011 11:32:24.374538 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/frr/0.log"
Oct 11 11:32:24.401126 master-0 kubenswrapper[4790]: I1011 11:32:24.401061 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/reloader/0.log"
Oct 11 11:32:24.419596 master-0 kubenswrapper[4790]: I1011 11:32:24.419496 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/frr-metrics/0.log"
Oct 11 11:32:24.447737 master-0 kubenswrapper[4790]: I1011 11:32:24.447647 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/kube-rbac-proxy/0.log"
Oct 11 11:32:24.470782 master-0 kubenswrapper[4790]: I1011 11:32:24.470743 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/kube-rbac-proxy-frr/0.log"
Oct 11 11:32:24.490505 master-0 kubenswrapper[4790]: I1011 11:32:24.490103 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/cp-frr-files/0.log"
Oct 11 11:32:24.518358 master-0 kubenswrapper[4790]: I1011 11:32:24.518295 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/cp-reloader/0.log"
Oct 11 11:32:24.543090 master-0 kubenswrapper[4790]: I1011 11:32:24.543039 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xkrb_bd096860-a678-4b71-a23d-70ecd6b79a0d/cp-metrics/0.log"
Oct 11 11:32:24.551119 master-0 kubenswrapper[4790]: I1011 11:32:24.551064 4790 scope.go:117] "RemoveContainer" containerID="4b15916ae1c47dcd2f111c279ca5484dea5119eb334e236bfe4a11ac7abf19d2"
Oct 11 11:32:24.551354 master-0 kubenswrapper[4790]: I1011 11:32:24.551231 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg6sc/master-0-debug-9jp8d"
Oct 11 11:32:27.847077 master-0 kubenswrapper[4790]: I1011 11:32:27.847007 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-54x4w_7359c204-2acb-4c3b-b05f-2a124f3862fb/frr-k8s-webhook-server/0.log"
Oct 11 11:32:27.895735 master-0 kubenswrapper[4790]: I1011 11:32:27.895657 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-56b566d9f-hppvq_01ae1cda-0c92-4f86-bff5-90e6cbb3881e/manager/0.log"
Oct 11 11:32:27.917200 master-0 kubenswrapper[4790]: I1011 11:32:27.917127 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-84d69c968c-btbcm_3d4bad0b-955f-4d0e-8849-8257c50682cb/webhook-server/0.log"
Oct 11 11:32:29.416346 master-0 kubenswrapper[4790]: I1011 11:32:29.416197 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8n7ld_7d3e23ec-dfa6-46d4-bf57-4e89ee459be5/speaker/0.log"
Oct 11 11:32:29.456039 master-0 kubenswrapper[4790]: I1011 11:32:29.455963 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8n7ld_7d3e23ec-dfa6-46d4-bf57-4e89ee459be5/kube-rbac-proxy/0.log"
Oct 11 11:32:32.259541 master-0 kubenswrapper[4790]: I1011 11:32:32.259459 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-85bvx_bfe05233-94bf-4e16-8c7e-321435ba7f00/tuned/0.log"
Oct 11 11:32:34.568777 master-0 kubenswrapper[4790]: I1011 11:32:34.568634 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_e7063ccc-c150-41d0-9285-8a8ca00aa417/installer/0.log"
Oct 11 11:32:34.681577 master-0 kubenswrapper[4790]: I1011 11:32:34.681509 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-guard-master-0_acfb978c-45a9-4081-9d1e-3751eea1b483/guard/0.log"
Oct 11 11:32:35.264799 master-0 kubenswrapper[4790]: I1011 11:32:35.264699 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_08bb0ac7b01a53ae0dcb90ce8b66efa1/kube-apiserver/0.log"
Oct 11 11:32:35.287020 master-0 kubenswrapper[4790]: I1011 11:32:35.286940 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_08bb0ac7b01a53ae0dcb90ce8b66efa1/kube-apiserver-cert-syncer/0.log"
Oct 11 11:32:35.314227 master-0 kubenswrapper[4790]: I1011 11:32:35.314139 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_08bb0ac7b01a53ae0dcb90ce8b66efa1/kube-apiserver-cert-regeneration-controller/0.log"
Oct 11 11:32:35.329430 master-0 kubenswrapper[4790]: I1011 11:32:35.329318 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_08bb0ac7b01a53ae0dcb90ce8b66efa1/kube-apiserver-insecure-readyz/0.log"
Oct 11 11:32:35.350967 master-0 kubenswrapper[4790]: I1011 11:32:35.350902 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_08bb0ac7b01a53ae0dcb90ce8b66efa1/kube-apiserver-check-endpoints/0.log"
Oct 11 11:32:35.369499 master-0 kubenswrapper[4790]: I1011 11:32:35.369415 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_08bb0ac7b01a53ae0dcb90ce8b66efa1/setup/0.log"
Oct 11 11:32:36.189789 master-0 kubenswrapper[4790]: I1011 11:32:36.189687 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_revision-pruner-6-master-0_14a0545c-12d2-49a0-be5e-17f472bac134/pruner/0.log"